Do you feel it is important for a brand to establish a relationship with its customers? Why or why not?
What are some of the pros and cons of a brand making "building customer relationships" part of its marketing strategy?
Why is it important for a brand to pay attention to changing laws, the environment, and new competitors entering the market?
Why is it more important than ever for a brand to be aware of social responsibility? (Hint: Think about some companies that have failed at this.)