Q: How did the Allies treat Germany after World War II?
It depends on which Allies you are talking about.

If you are talking about the United States, France, and Britain, they treated West Germany comparatively well and helped it rebuild as a new state, including through Marshall Plan aid. France and West Germany, along with four other countries, founded the European Coal and Steel Community, a forerunner of the EU, with the 1951 Treaty of Paris.

The Soviet Union, however, ruled East Germany oppressively, and many people tried to flee to the West; the Berlin Wall was erected in 1961 to stop escapes from East Berlin.

