Not in a strict literal sense. Japan was occupied militarily by the United States after WWII, until the San Francisco Peace Treaty came into effect in 1952. The occupying forces made many changes in Japan, including rewriting the constitution to renounce war, but this was an occupation, not a colonization. Earlier, Kublai Khan, the Mongol ruler of China, tried twice to invade Japan, in 1274 and 1281, but both times his fleet was destroyed by a combination of bad luck, poor planning, and bad weather. Both fleets were struck by powerful typhoons before they could establish a beachhead.


Related Questions

Who colonized Japan?

Japan was never colonized by a foreign power.


What happened after Japan was colonized?

Japan was never colonized. It was occupied briefly by the US after WWII.


How long did Japan colonize the Philippines?

Japan occupied the Philippines from 1941 to 1945.


Why did Japan colonize Indonesia?

Japan didn't colonize Indonesia. It was the Dutch (the Netherlands) who colonized the country.


Was Panama ever colonized?

Yes, Panama was colonized by Spain.


What countries colonized Oceania?

France, Germany, Britain, the USA, and Japan. French Polynesia (including Tahiti) is under French rule. Parts of New Guinea and part of Samoa were colonized by Germany. The British colonized Kiribati. The US colonized Guam and part of Samoa. Japan also colonized several islands nearer to Japan, such as the Bonins.


Did Madagascar ever colonize a territory?

No, it didn't.


Was Indonesia ever colonized in 1914?

Yes. In 1914 Indonesia was the Dutch East Indies, a colony of the Netherlands.


Was Haiti ever colonized or invaded?

Yes. Haiti was colonized by Spain and later by France.


What country colonized Borneo?

Borneo was colonized by the Netherlands (the part that is now Indonesian) and Britain (the northern part); Japan occupied the island during WWII.


When did the United States colonize Japan?

Japan was never a colony of the United States. Japan was defeated and surrendered to the US at the end of WWII.