Western United States

About Western United States


The Western United States is the region comprising the westernmost states of the United States. As European settlement in the U.S. expanded westward through the centuries, the meaning of the term "the West" changed. Before about 1800, the crest of the Appalachian Mountains was seen as the western frontier.
