Best Answer

The Americans, along with the other Allies, invaded France in World War 2 because it was a German stronghold. Not only was it the only realistic place to land in Europe, but it would have to be taken eventually to win the war.

Wiki User

12y ago
More answers
Wiki User

15y ago

Some say they fought the French because both sides held a grudge and the French kept sending troops to gather information about the US, and that this is how it began. No. The US forces were better received in North Africa than the British. There was some shooting as an act of defiance at troops landing on French soil, and then some shooting in earnest in which people were killed, even though the Allies were landing to evict the Germans from Africa and so ensure that France would eventually be liberated. The US and the British did not do everything correctly in WW2, far from it, but the role of the French in North Africa at the end of 1942 was totally deplorable.

Wiki User

14y ago

Japan did not invade France in World War 2. Germany did. Japan invaded the French Colony of Indochina. The United States provided guerrilla training to Ho Chi Minh and a number of other Indochinese so they could fight the Japanese. They used that training and inflicted tremendous losses on the Japanese. At a later time, they would use that training to fight first France and then the United States.


Wiki User

12y ago

To liberate France and to get to Germany.


Q: Why did the US fight the French in World War 2?