I recently read a Huffington Post article about how we are “cramming religion down children’s throats.”
There are many issues with such a statement and the thinking behind it. We must teach our children something; no one grows up in a vacuum. If parents are not to decide what to teach their children, then who should? The government? Politicians? My neighbors? What makes you more qualified than I am to teach and train my kids? The God you don't believe in didn't give them to you! We must teach something; while some children may grow up in immoral circumstances, no child grows up in amoral circumstances.
So why do we teach what we teach? Who determines what is right and wrong? What do you think?