What year did the U.S. formally enter WWII?
The United States formally entered World War II on December 8, 1941, the day after the surprise attack on Pearl Harbor by the Japanese military. The attack galvanized public opinion, and Congress declared war on Japan that day. Shortly afterward, Germany and Italy declared war on the U.S., bringing America fully into the European conflict as well. America's involvement significantly altered the course of the war and contributed to the eventual defeat of the Axis powers. Entry into the war established the U.S. as a major military and political power on the global stage and laid the foundation for its post-war influence.
Equestions.com Team – Verified by subject-matter experts