There are so many ideas in our culture suggesting that if you experience pain, you will end up gaining something from it. I do see truth in this, but I don't see pain as a prerequisite to happiness or to positive events occurring in your life. Still, many perspectives in our culture (I speak as a North American) seem to insist that pain is necessary.
The idea that you live a life of suffering so that you get your reward when you die never sat very well with me. I love the idea that I am 100% responsible for my life, and that it is not in the hands of fate. This means that I don't ever have to suffer if I don't want to. I can choose to see my life in any light I wish. It is incredibly freeing!