Cinema explains American society. It's like a Western, with good guys and bad guys, where the weak don't have a place.
The things that have always been important: to be a good man, to try to live my life the way God would have me, to turn it over to Him that His will might be worked in my life, to do my work without looking back, to give it all I've got, and to take pride in my work as an honest performer.
But when it comes to writing, the thing that I've sort of been thinking about lately is: why? You know, is it rational? Is it logical that anybody should be expected to be afraid of the work that they feel they were put on this Earth to do?