I can be wordy when trying to put my thoughts into words, so I’ll do my best to keep this brief.
I like Christian fiction, primarily because I can be assured it won’t be full of sex and profanity.
The majority, it seems, are romance, which isn’t my cup of tea, but as long as it isn’t sappy I can enjoy it. However, I find very few Christian fiction books that proclaim the Gospel.
Most never mention sin or even Jesus Christ. God is presented as a generic deity: follow Him, and life will be mostly happiness and peace.
My question is primarily for Christian authors: why do your books seldom, if ever, proclaim the Gospel?
I’m open to others’ thoughts.
Thanks,
To Him be the glory in ALL things.
Scott aka Prison Preacher