During the interwar period, a notable shift in U.S. foreign policy toward Japan began to emerge. The United States had previously established treaties with the Japanese Empire and maintained significant trade links with it. However, as Japan continued to expand its empire in the Pacific, relations with America swiftly declined.


