According to the reading, the most important thing to understand about U.S. culture is individualism. Americans are trained from childhood to see themselves as separate individuals who are responsible for their own actions and the consequences of those actions. When they grow up, they are expected to move out and live independently, handling their own money and relationships and standing on their own feet.