I understand the basic reason why guns are an inherent part of American culture. The Constitution secured the right to bear arms in a time of revolution, and once that was over, the United States was an enormous country whose westward expansion pushed settlers into dangerous territories and into frequent conflict with largely unfriendly Native Americans. The love of guns has never really gone away since those days.
Nevertheless, I'd like to hear why people think guns remain such an important part of the culture. Is it a safety issue? Do people enjoy the thrill of owning a firearm? Is it strictly about self-defense? Do economics play a bigger role than we might think? Let's hear some opinions.