The Western United States (also called the American West, the Western States, the Far West, the Western territories, and the West) is one of the four census regions defined by the United States Census Bureau. As American settlement in the U.S. expanded westward, the meaning of the term the West changed. Before around 1800, the crest of the Appalachian Mountains was seen as the western frontier.