History, 17.04.2020 18:13, morbidodyssey

What did the governments of the United States and Japan agree to at the end of WWII?

A. work together to defeat the Russians.

B. stop the spread of Communism in Asia.

C. work together to defeat Nazis.

D. work together to rebuild Japan.


Other questions on the subject: History

History, 21.06.2019 20:30, maria5633
African-American soldiers in World War II sometimes felt they were fighting for two victories, one abroad and one in Europe. True or false?
History, 21.06.2019 23:10, Kenzie5755
Religious men who lived apart from society
History, 22.06.2019 00:00, dee4648
Were African-Americans allowed to serve in the military in WWI?
History, 22.06.2019 03:20, 2Dgames
Name at least three major issues for women in Indonesia.