1941 in the United States
List of events / From Wikipedia, the free encyclopedia
Events from the year 1941 in the United States. At the end of this year, the United States enters World War II by declaring war on the Empire of Japan following the attack on Pearl Harbor.