Because people find death extremely scary, and people need to know that there is some point to life. So in other words, they hide from the truth behind their faith, but really, can you blame them? How depressing, sad and pointless does life seem if we die and that's it?
Totally. When I was a little kid I'd have chats with my Mum about what happened when we died. I had my theory that basically you go somewhere else, not necessarily somewhere akin to "heaven" but, you know, just somewhere, like maybe where ghosts are, perhaps a different plane of existence. So if you die, you basically stay in your time frame, say the 20th century, and just live there as a ghost on a different plane, that sort of thing.
Anyway, my Mum burst out with (after I'd got really into explaining my theory): "Oh really? I think when I die, I'll just get burnt and thrown in the garden for the plants. When you die, that's it, there's nothing after that."
I was slightly crushed by the "this is it, after this there's nothing" frame of mind, but it makes sense. If you have hope of something better, then it makes the shitty times worth living. I mean, fuck, having a shit time alive is better than, well, nothing, just ceasing to exist...