Should You Always Tell The Truth?

Canada
February 4, 2007 6:45pm CST
Although most people believe lying is bad, many also believe lying is okay under certain circumstances. I read an interesting article on the subject that shows the damage we cause to our friends, our families, our mates, and even our jobs when we lie. It really made me think about the terrible consequences lying can have on our relationships with others. So, please tell me what you think.
5 responses
@cultoffury (1283)
• India
5 Feb 07
No, I don't think you should ALWAYS tell the truth. Act depending on the situation. At times, "a pleasant lie is better than a painful truth." Some lies even change lives, so I am a firm believer in that motto. What do you think? I would not advise lying for worldly benefits, but lying for a noble cause is not a problem, I suppose?
@Laydee83 (275)
• Atlanta, Georgia
5 Feb 07
No one is honest these days. It's a shame. 98% of the world is a lie itself: politics, the atmosphere, society, people, life in general. Should we all be expected to tell the truth in a grimy, lying world like this?
• United States
5 Feb 07
There's nothing wrong with telling a white lie every now and then. If it's going to prevent someone's feelings from getting hurt, then it's fine.
@bsabers (668)
• United States
5 Feb 07
I would say that you should always tell the truth, unless of course your life is on the line or something serious like that.
@vinoth_123 (1876)
• India
5 Feb 07
Mostly I tell the truth, though sometimes I tell a lie with family members. I don't lie to my friends, but as a joke I might tell a lie.