Workers' Compensation
Workers’ compensation insurance, commonly called “workers’ comp” or “workman’s comp,” protects both your employees and your company if an employee suffers a work-related injury or illness, including strains, trips and falls, occupational disease, or accidental death. It is mandatory in most U.S. states for businesses with employees, and employers who fail to carry it may face penalties and fines.