What industries do you admire for their honesty and forthrightness? I'm not talking about specific companies; I mean industries in general, like banking, insurance, autos, investments, oil, credit cards, computers, airlines, health care, and defense. Why? What do they do to make our country, society, and world a better place? How do they treat their employees? How do they influence our government? What effects do they have on the environment, the marketplace, and so on?