arnguy said: Johnpet, you stated you have no idea how the concept of employer-paid health insurance came about. Well, it started during World War II when there was a shortage of labor. Companies could not raise wages because of price controls imposed by the Feds, so what they did was add fully paid health insurance as a benefit to attract employees. After WWII, labor came to expect this perk, and unions fought hard to make sure it was included in collective bargaining agreements. This "trickled down" to small businesses. Soon all employees came to believe that employer-paid health insurance was expected, even mandated (which, of course, it was not and still is not).
Great thread.
Yeah, this was something that I was going to add to the thread...but it appears I was beaten to the punch.
So...I can say..yeah..that's how it started =).