Think about it: parents are always saying you should be honest with everybody, yet they constantly lie to you and to those around you.
Then you grow up, you learn to tell "white lies", and you need to be able to talk a load of bull in job interviews to get a job.
What's the point of teaching us to be honest when we spend the rest of our lives lying? Hell, most people are proud when their kids lie after getting into trouble. You know the line, "But I didn't do it," despite their being the only person in the room at the time.
As far as I'm concerned, if your kid doesn't do that at some point, you raised them wrong.
-
Speak for yourself...