About Half The Public Believes Employers Should Be Allowed To Require A COVID-19 Vaccine For Employees, Including A Larger Share Of Democrats Than Republicans
Do you think employers should be allowed to require certain employees to get vaccinated for COVID-19, or is this not something employers should be allowed to do?