A majority of Britons and Americans think it is important for brands to have a clear viewpoint on social issues
Chart question: "How important, or not, is it to you that the brands you like have a clear/transparent point of view on wider issues in society?" (% of GB and US adults)