Right. There are legitimate businesses where the culture is to sell you the job. A lot of financial industry and insurance companies do this. Some require you to pay for training, some do not. There's always an exception to the rule. The key is to be aware of the warning signs and weigh them against each other. …