American society was formed in violence, and it has remained violent ever since. From settlement to independence to nationhood, the United States has relied on force to build its institutions. Americans have always owned guns and used them to project strength and vigor—think of the gunslinging cowboy or the gun-carrying lawman immortalized by Hollywood.