It's a parent's duty to tell their children that they must never lie, yet most of us tell them so many little white lies during their younger years.
Santa Claus, for instance, and the tooth fairy, and fairies at the bottom of the garden.
I know we make these up to make our children happy and excited, but is it really necessary?
I remember not quite understanding why a few of our friends, who had been very naughty over the year and whose parents were far better off than ours, received very expensive toys from Santa Claus while we only had small things like colouring books, plasticine and so on.
When we eventually found out, at the age of seven, that none of this ever existed, we were very hurt and confused to think that our parents had told us all these tales knowing they weren't true.
I wasn't happy telling my children all these tales and making them out to be true, but I went along with it because refusing would have spoiled it for other parents.
So is it really necessary to carry on telling children that Santa Claus flies through the air on his reindeer and delivers presents, or would they be just as excited receiving gifts whoever sent them?