Q: How did the Allies treat Germany after World War II?
It depends on which Allies you are talking about.

If you are talking about America, France, and Britain, then they treated West Germany with respect and helped it grow as a new country. France and West Germany, along with four other countries, created the European Coal and Steel Community (the forerunner of the EU) in 1951 with the Treaty of Paris.

The Soviet Union, though, oppressed East Germany, and many people tried to escape to the West. The Berlin Wall was erected in 1961 to stop them.

Wiki User ∙ 13y ago

