06/10/2021

Yes, Your Employer Can Require You to Be Vaccinated

Companies can require workers entering the workplace to be vaccinated

As many Americans prepare to head back to the office, companies are hammering out policies on the extent to which they will require, or strongly encourage, employees to be vaccinated against the coronavirus.

The bottom line is that companies are legally permitted to make employees get vaccinated, according to recent guidance from the federal agency that enforces workplace discrimination laws, the U.S. Equal Employment Opportunity Commission.

Here's the latest about the rules in the United States on vaccinations in the workplace.

The complete article is available from The New York Times.
