The Norfolk and Norwich Christian community website

Why theology is more complex than science

Regular Network Norwich and Norfolk columnist James Knight writes about the crossover between the miraculous and the non-miraculous as he continues his series on miracles.



Is there a crossover between the miraculous and the non-miraculous? Human knowledge is always on the move, and there are many things we can infer a posteriori without having a full explanation of how they work (continental drift, plate tectonics and weather systems, to name but three). Our knowledge of exactly where the non-miraculous gives way to the miraculous may therefore be rather like stating at what point east becomes west or hot becomes cold. Our categorisations of what constitutes the ‘prosaically uniform’ are not only a problem with regard to miracles; they are a problem for science in general, because our knowledge is always shifting in some way, shape or form.  Black and white category distinctions are often what atheists use to justify their scepticism, but this obscures an important truth: science deals largely with simple, controlled and fairly accessible objects, whereas theology does not, so expecting to make simple sense and light work of a complex subject in a complex ontology is to be guilty of rash thinking. 


This does not mean that the message of salvation is complex - it is relatively simple.  But withdrawing from simple truths that are a priori accessible into more convoluted scepticism distorts the simple elements of the reality under discussion.  In doing this, humans move into territories where crossovers and counterfactual objections obscure the simplicity of each element of the truth, making a relatively uncomplicated pursuit of salvation much more difficult.  Theology is made up of complex compounds, which naturally become cloudy or coagulated when treated with the wrong philosophical substances.  Attempting to make sense of one's experiential world using complementary categories of A versus B, and adopting a join-the-dots ontology, is fine as long as one has a good understanding of the basic interworkings of nature and of the socio-personal.  In the case of the former, that means everything from the laws of physics through to the assessment of complexity and accessibility in various frames of investigation, and the logic that underpins it all.  In the case of the latter, it means the complex ways in which humans interact, communicate, exchange information and share values.  Attempts to apply an informal scientific method to the socio-personal do not work at the best of times, as most things arrive in uncontrolled and informal conditions.  So in assessing theology - in particular, how Christ has impacted people's lives and whether any miracles are occurring - we will only make progress once we stop trying to fit anecdotal experiences and second-hand testimonies into a worldview that wants everything expressed in bivalent principles of logic. 


This is one of the reasons why certainty over which coins are of the highest value in nature may elude us; for we mostly find that as soon as we begin to trace back the steps of our logical endeavours we reach a discontinuity that halts our epistemological access - an impenetrable wall stops our train of thought and leaves us with a logical chain consisting of only a few links.  This is largely because the mind is one seamless entity, which suggests potential for extraordinary logical continuity, but the materials and objects with which the mind works conceal a multitude of causative secrets - we scarcely know much about the vast nexus of causality behind the few accessible objects we work with, and this is even truer of the mind, as it relates causatively only to what is directly in front of it.  I do not mean that we cannot bring extraordinary fecundity to our understanding of vast ranges of subjects and objects, but in any one instance there is only a finite number of dots for the mind to join - the full picture and the background nexus of causality are constructed inductively, and how accurately our conclusions reflect reality depends on how much data we have to manage and on the algorithmic methods used to make sense of things - and this varies with each individual. 


If Christianity is true then the engine that drives our societies, our experiences, our theory-making and our social philosophies is choreographed by the active will of our sovereign, all-loving God.  Naturally, in a world in which God permits our free choices and autonomous living, the reconciliation between His power and our powers of reasoning can be very difficult to define and catch if one uses a crude form of questioning that fails to a) engage with the big issues in the right way, and b) recognise the vast intractability of the subject of theology.  Thus, the dualistic mentality that attempts to make clear-cut distinctions between the ordinary and the miraculous is hard to manage even at the best of times. 


In biological terms we are poorly evolved mammalian creatures shaped by nature’s laws - including, in the most local sense, our evolution, which runs on genetic algorithms that are themselves patterns in space-time impressed on material elements, most prominently our own minds.  Our minds are part of the very things that produce these mathematical patterns.  Yet by some wonderful fortune or Divine agency we have ‘command’ minds that assert authority over the very patterns that define us; in fact, we engage in much more complex computational constructions when we develop our heuristics – we construct our own search algorithms at both minimal and maximal levels, from simple solutions to more complex assessments of information. 


As I said, our active and dynamic cognition functions through many epistemological variables with a huge number of local minima and maxima.  Most of our search heuristics involve finding local optima, and this is relatively straightforward; evolution has shaped us as pattern-seekers who use local optimisation methods.  You may not realise it, but everything you think and do is in some way a utilisation of the universe's mathematical and computational outputs - it is just that our inputs transpose them into simple heuristics we use in day-to-day routines.  A much harder task is finding the global maximum or minimum of any heuristic - all but impossible, because there always exists a vast nexus of connected objects, models and agents that conceals the full logical path – so given the deep unknowns of the subconscious, we cannot know the full extent of socio-personal influences. 
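The gap between finding a local optimum and finding the global one can be sketched with a toy hill-climbing search (a hypothetical illustration in Python, not anything specific to this series):

```python
import math

def hill_climb(f, x, step=0.01, max_iters=10_000):
    """Greedy local search: move uphill until neither neighbour improves."""
    for _ in range(max_iters):
        candidates = [x - step, x, x + step]
        best = max(candidates, key=f)
        if best == x:
            break  # a local maximum: both neighbours are lower
        x = best
    return x

# A landscape with two peaks: a small one near x=1 and a taller one near x=4.
f = lambda x: math.exp(-(x - 1) ** 2) + 2 * math.exp(-(x - 4) ** 2)

print(round(hill_climb(f, 0.0), 2))  # started near the small peak: ~1.0
print(round(hill_climb(f, 3.0), 2))  # started near the tall peak:  ~4.0
```

Started near the small peak, the search settles there and never sees the taller one - which is the point: local optimisation is easy, global optimisation is not.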


Through our acquired knowledge of genes, species and other taxa (all of which were classified by us, by the way, because they fit mathematical models we have created) we can construct computational algorithms that assist with phylogenetic analyses, from which our knowledge of phylogenetic trees comes - from bacteria right through to hominids - trees illustrating relationships among biological species that we defined and categorised in the first place.  Thus the ‘maximum’ knowledge of biology provides a simpler heuristic than the ‘minimum’ heuristic of human behaviour, but this point is seldom properly considered – as evidenced by the mistake people make in thinking that biological evolution is more complex than theology, and that the latter by comparison is a few simple logical steps.  It isn’t, and until this is realised our enquirer will be in trouble. 


Often our brains construct models that work but cannot yet be verified, because we await technology catching up with our own innovations.  A child playing with his toys can work out all sorts of laws of space and geometry and simple mathematical formulas, which he puts into practice in ways he is largely unaware of.  A very simple example, which goes right back to Kepler in the 17th century, is filling a large container with small equal-sized spheres, such as putting snooker balls in a shoebox.  The density of the arrangement is the proportion of the shoebox's volume taken up by the snooker balls, so in order to maximise the number of balls in the shoebox one must find the arrangement with the highest possible density, packing the balls together as closely as possible.  Experiments show that a random arrangement of spheres achieves a density of only around 64%.  To achieve a higher density, the snooker balls need to be placed in a hexagonal lattice, with each successive layer seated in the hollows of the layer beneath.  The principle is best observed in greengrocers’ shops, in their stacks of (nearly) spherical fruits (oranges, apples etc).  I would bet that most greengrocers do not know the Kepler conjecture, but they do not need the background theory to stack oranges the right way on instinct.  Each step of the stacking process offers choices of where to put the next layer, and the highest average density - about 74% - can be achieved by two methods, one called 'cubic close packing' and the other 'hexagonal close packing'.  Kepler's conjecture is that no arrangement of identical spheres in infinite three-dimensional space, regular or irregular, can ever take up a greater fraction of the space than these close-packed arrangements do.
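The key densities are easy to compute (a minimal sketch; the close-packing figure is the one Kepler conjectured to be optimal, later proved by Thomas Hales):

```python
import math

# Density proved optimal by the Kepler conjecture: pi / (3 * sqrt(2)).
optimal = math.pi / (3 * math.sqrt(2))

# The naive alternative: one ball of radius r per cube of side 2r,
# giving (4/3) * pi * r^3 / (2r)^3 = pi / 6.
simple_cubic = math.pi / 6

print(f"simple cubic packing:  {simple_cubic:.2%}")  # 52.36%
print(f"cubic/hexagonal close: {optimal:.2%}")       # 74.05%
```

Randomly poured spheres land between the two, at roughly 64% - which is why the greengrocer's instinctive lattice stacking genuinely beats just tipping the oranges into the crate.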


Without an expansive computational model this theorem is hard to prove, because direct calculation is only tractable for lattice arrangements.  Eliminating all possible irregular arrangements (that is, non-lattice arrangements) is the difficult part, particularly given that over a small enough volume a few irregular arrangements turn out to be denser than 'cubic close packing'.  Given this, how is one to prove Kepler's theorem without brute force?  The only known way is proof by exhaustion, in which what we are trying to prove is split into a finite number of cases, each proved independently of the others.  As we saw from our black swan analyses regarding miracles, search spaces are usually too vast to search, and things like miracles are not amenable to computational proof by exhaustion.  The difference is that for Kepler's conjecture a proof could be obtained by exhaustion - checking every individual case using complex computer calculations - whereas for miracles it cannot. 
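The method itself can be illustrated with a toy proof by exhaustion - a claim about infinitely many numbers reduced to four checkable cases (a hypothetical example, vastly simpler than Kepler's):

```python
# Toy proof by exhaustion: "every perfect square leaves remainder 0 or 1
# when divided by 4."  Since (4k + r)^2 = 16k^2 + 8kr + r^2, the remainder
# mod 4 depends only on r, so checking the four residues r = 0..3 proves
# the claim for all integers at once.
cases = {r: (r * r) % 4 for r in range(4)}
print(cases)  # {0: 0, 1: 1, 2: 0, 3: 1}

assert set(cases.values()) <= {0, 1}
print("all four cases verified - the infinite claim is proved")
```

Hales's proof of the Kepler conjecture had the same shape, only with thousands of cases, each dispatched by computer rather than by mental arithmetic.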


This has practical relevance to our discussion about miracles, because whether we are formulating medicinal drug designs with protein structure prediction, or using combinatorial optimisation in various algorithmic and computational models, we are always performing cognitive searches - attempting to optimise our abilities, utilising function extrema for problem-solving, and joining the dots of life to produce coherent worldviews.  You may think you do not engage in much combinatorial optimisation, but plan a trip around Europe visiting several cities, with budgets and transport itineraries, and you'll find that you use it far more than you think.  In fact, elementary things like driving to work and going around the supermarket involve basic combinatorial optimisation - and the reason this matters is the simplicity/complexity problem: how we optimise our methods when considering something as broad and complex as theology, and how the interactive personality and dynamic mind of God engages with His creation.  If something as comparatively tractable as protein structure prediction is hard because the number of possible protein structures is extremely large and the search strategy problematic, imagine the sort of search strategies we are dealing with when considering 'mind' itself and the vast non-mechanistic realities that the mind envelops.  The 'personality' or 'character' of even the person you know best and are most familiar with is like a seamless set of abstractions, classified only with the help of language, semiotics and other forms of somatic communication.
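The trip-planning example is the classic travelling salesman problem, and for a handful of cities the brute-force search is small enough to run (the distances below are rough straight-line figures in kilometres, for illustration only):

```python
from itertools import permutations

# Rough straight-line distances in km between four cities (illustrative).
dist = {
    ("Paris", "Berlin"): 878, ("Paris", "Rome"): 1106,
    ("Paris", "Madrid"): 1054, ("Berlin", "Rome"): 1184,
    ("Berlin", "Madrid"): 1869, ("Rome", "Madrid"): 1365,
}

def d(a, b):
    # Distances are symmetric, but only one direction is stored.
    return dist[(a, b)] if (a, b) in dist else dist[(b, a)]

def tour_length(order):
    # A round trip that starts and ends in Paris.
    route = ("Paris",) + tuple(order) + ("Paris",)
    return sum(d(a, b) for a, b in zip(route, route[1:]))

# Exhaustively check every ordering of the intermediate cities.
best = min(permutations(["Berlin", "Rome", "Madrid"]), key=tour_length)
print(best, tour_length(best))  # shortest round trip: 4481 km
```

Three intermediate cities mean only six orderings to check; a dozen would already mean nearly forty million, which is why real itineraries rely on heuristics rather than exhaustion - exactly the local-optimisation habit described above.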


At base level, atheism has to admit a stronger affinity to elementalism than Christianity, owing to its adherence to Humean reductionism, post-Enlightenment materialism and the other monist ontologies ushered in by thinkers as far back as Democritus, Epicurus and Lucretius.  It may be true that there are not many conventional materialists left today, but one cannot get away from the fact that naturalism uses logical and exploratory heuristics which axiomatically regard the prime origins of the physical world as elementary objects with little intrinsic complexity.  And as already discussed, not only does this present huge problems given that nature cannot be compressed to absolute zero (strongly suggesting that the prime origin is to be found at the upper end of the complexity scale), but just as importantly, at a local level naturalists seem to disregard the possibility of logical hiatuses - positing a simple bit-by-bit controlled trail of logic that either explains everything away with simplistic models, or clings to the view that time will explain away these mysteries and that one’s personal reality is not supervenient on something much more expansive; a reality that explodes with potential far beyond ordinary day-to-day things. 


If, for the Christian, these events must be defined as out of the ordinary - classifying them as God invading ordinary human logic and supplementing the ordinary - then the term ‘miraculous’ may well be applicable, as long as we are sure about what we are doing in positing something beyond the headlights of science and philosophy: not just to be arcane and mysterious or to tantalise, but to offer the naturalists an ontological carrot that they cannot hope to reach unless they come over to our side.


Perhaps a good starting point is this.  Just as we recognise that even at the quantum-physical level the natural order only holds good because of the host of preconditions that underwrite nature, the socio-personal only holds good because of a series of preconditions – love, grace, generosity, solicitude, the golden rule, forgiveness, evidence-based rationale, hard work – and the Bible makes as good a case as any for strict adherence to these best principles.  If Christ is the fullness of God in human form, then it is likely that His words in the Bible were not just one-off dictums relevant for a time but, in fact, the foundational logic from which these ‘invasions’ occur – the logos or ‘truth’ through which all other truths filter into the socio-personal.  This shows clearly why the reductive trail of elementary logic just won’t do here - if Christ is God then revelation is as seamless as our own mind; our conscious mode of existence is dependent on the principles of logos, and human cognition would be entirely debased without it.  He is the Vine and we are the branches.


Epistemological limitations

I would like now to remind you of a point I made in part two of this series - a point very relevant to our knowledge of science and how it continues to increase and confound our views about what could once have been thought of as miraculous:


“Imagine if we took a two-way radio transceiver back to the time when Maxwell was developing his theory of electromagnetic waves; most of the lay people may well think they were relying on some sort of supernatural ‘wavelength’ because however hard they pondered radio transceivers they would probably exhaust their list of naturalistic explanations.  Maxwell, however, being on the cusp of a great discovery, and with much groundwork already in place, would likely possess the acumen to investigate whether the two devices were using electromagnetic waves between them to carry the voices.  Atheists rightly criticise arguments from ignorance, and hasty ‘God of the gaps’ postulations, so even if one could find some conflation that merges ‘science’ and the ‘miraculous’, one might justifiably contend that as time progresses the latter will continue giving way to the former, leaving us with science and further exploratory vistas that must by definition also be classified as science lest the same problem arises again.”


Given that epistemologically we are always on the move, how can we ever affirm a concrete concept of something being ‘miraculous’ when future scientific discoveries may confound those affirmations?  Let us say that the total possible knowledge of the cosmos for super-sentience (God’s knowledge) is represented by Ω (capital Omega - the total multiplicity of the universal system).  It is probable that the most we can ever know about the universe is Ω minus ω (lower-case omega - that is, ‘small’ in contrast to Ω).  In other words, humans clearly will not be able to know everything about the universe, so even if we measured the total cumulative knowledge acquired by human methods when kingdom comes, or just before human life ceases to exist, we would arrive at a figure well short of Ω.  Given that some miracles clearly do contradict our present understanding of nature’s laws, it seems that if they are true there are two likely possibilities.  Either their 'one-offness' is somehow embedded in the vastly complex cosmological algorithms that God instantiated in nature's original blueprint (the ω that we cannot know because we are bound by our epistemological limitations - much of them cause-and-effect limitations), or alternatively we can potentially know a much higher proportion of the ordinary workings of the cosmos, but God’s one-off acts are beyond our analysis because they involve ‘actions’ from an exogenous source.  In other words, using my model, the higher-value coins are either in nature all the time but hidden in ω, or they are dropped in at the behest of God and are not really part of nature’s inherent blueprint at all. 


The problem we have is less about understanding how God might interact with nature (although that is assuredly beyond our understanding) – the problem is that we will probably never get close to understanding all the subsets of reality, so our knowledge will always fall short.  Whether we have a wholly deterministic physics, where the universe assents to physical laws in a closed system, or a more ‘open’ physical regime, where state-vector jumps occur and are genuinely ‘random’ in the sense of being discontinuous, we may never be apprised of such a distinction.  On the deterministic view the randomness of quantum theory is only apparent, not absolute; it is the discontinuity between the knotty interwoven fabrics of quantum systems and the macroscopic methods of measurement that leads to the apparently random changes. 


With quantum jumps and absolute randomness (for example, a completely random change of an electron from one quantum state to another within an atom), the electron makes discontinuous jumps between energy levels after a state of superposition, and this certainly contradicts classical mechanical theories, in which energy varies continuously and deterministically.  Moreover, any thought that the calculus of time derivatives, motion, force and acceleration could define all the physical interactions in nature has long since been abandoned.  The trouble is, as I said, we may never acquire an understanding full enough to test a system for absolute randomness, which means that a state vector in which quantum randomness is absolute and one in which it is merely apparent (but in fact deterministic) will look the same to us - rather as, in macroevolution, the whole history of natural selection would likely look the same to us when we observe its genetic algorithms, whether the process had been Divinely guided or not. 


One must remember that a measurement on a quantum state gives us a probability distribution, which of course, like all probabilities, is bound up with our knowledge; so measurement even on a pure quantum state is determined only probabilistically, not absolutely.  What this means is that the seemingly contradictory mathematical descriptions - the deterministic unitary time evolution of the Schrödinger equation, Heisenberg's matrix mechanics and Feynman's path integrals on the one hand, and stochastic (random) wavefunction collapse on the other - are not contradictory when our physical intuition is applied in the right way (with macrostates).  But in the unphysical system of microstates (for example, the isolated atom in Schrödinger's cat) orthodox quantum mechanics entails a paradox, and must be rectified with quantum-probabilistic frameworks in which the classical Kolmogorov probability model is present in the main structures of quantum theory (including Born's rule and Hilbert state space). 
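Born's rule, mentioned here, is simple to state in code: each measurement outcome's probability is the squared magnitude of its amplitude in the state vector (a minimal sketch of a two-outcome state):

```python
import cmath
import math

# Born's rule: outcome probability = |amplitude|^2.  Here, an equal
# superposition of |0> and |1>, with an arbitrary phase on |1> to show
# that phases leave the probabilities untouched.
amplitudes = [1 / math.sqrt(2),
              cmath.exp(1j * math.pi / 4) / math.sqrt(2)]

probs = [abs(a) ** 2 for a in amplitudes]
print([round(p, 3) for p in probs])   # [0.5, 0.5]
assert math.isclose(sum(probs), 1.0)  # a valid state: probabilities sum to 1
```

The deterministic part of the theory evolves the amplitudes; only at measurement does the probabilistic rule apply - which is exactly the tension between unitary evolution and collapse described above.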


Having said all that, a robust and conclusive theory of everything won't herald the end of theoretical physics; the universe is endowed with too many 'possibility' states for a complete cessation of exploration.  Moreover, we would need to hark back to the Planck-era beginnings to be sure that there aren't more than the four fundamental forces governing our universe. 


Furthermore, supersymmetric string theory - in its eleven-dimensional, M-theory form - contends that the universe has more working dimensions than the four we experience, placing a more determined set of interactions on the physical regime and extending quantum mechanics as we know it.  This envelops our collective understanding to the point of allowing 'size of the string' frameworks, which could potentially save us from having to tender hypotheses about Planck-size quantities, because our knowledge would be framed on the vibrational rates of one-dimensional strings, with approximations accurate and consistent with the Planck length.  Given the extra dimensions we are dealing with, and the fact that these strings are likely to be infinitely small building blocks consisting of only a dimension of length - no height or width - it is difficult to get a purchase on how strings that are in effect one-dimensional slices of a two-dimensional membrane vibrate in eleven-dimensional space.  However far we develop this theory towards completion, one suspects that Max Born was right that descriptions of nature will remain essentially probabilistic, with the probability of any event related to the square of the amplitude of the wave function associated with it (http://en.wikipedia.org/wiki/Born%27s_Rule).


Varying degrees of interpretation as we explore different contexts of the universe’s potential, and the likely inability to scour extra spatial dimensions, will likely see to it that we will always be exploring facets of reality - our investigative work will never be complete. 


Einstein was never fully satisfied with the apparently probabilistic description of nature embodied by quantum theory - but it turns out that God's throwing of the dice is as much about our own epistemological limitations as it is about absolute randomness in the universe, so the chances are that if miracles are some form of God front-loading the dice, they will likely be undetectable against the background of randomness anyway.  Even at a cosmic level our notions of logic, space and time will undoubtedly require some revision to accommodate a unifying theory that brings quantum theory and relativity together, or supersedes them both with a departure.  Moreover, our cognitive framework is bound up in a continuum model of space-time, with most of our knowledge and theorising adapted to a framework that supports bivalence principles (sorites-type paradoxes notwithstanding), and this strongly compounds my view that nature will be richer and more varied than our human minds can get to grips with - in fact, we may never know what we could never know, or the extent to which we were unapprised, until kingdom comes. 


Where miracles are concerned, the 'continuum fallacy' shows that in many cases there exists a continuum of states between two propositions A and B (such as, say, hot and cold).  And although in quantum physics notions of continuous length break down at the Planck length - rendering the physical regime a nexus of many discrete states rather than continua - human minds lean towards bivalence as they seek to create category distinctions: true and false, happy and unhappy, large and small, fat and thin, simple and complex, soft and hard, and most importantly in this case, miraculous and non-miraculous (or natural and supernatural). 


Given the foregoing observations, the demarcation line between the miraculous and the non-miraculous may at times be hard to define.  What if God heals flu quicker than expected, to make you well for an important job interview?  How does that compare with a sore throat healing, given that our physiology and evolutionary history mean the body heals itself?  This is rather like deciding whether a room is hot or cold: as soon as we start to make category distinctions we find a lot of in-between states where the room is neither hot nor cold, because temperature is not amenable to clear bivalent logic.  And given our evolutionarily acquired immune system and bodily regeneration, unless healings are of an extraordinary nature that contradicts the laws of physiology (as many do), defining them as miraculous is not easy; our algorithmic approach to cognition, through which we compute bivalent rules of identification to reach solid conclusions, sometimes has to give way to a vaguer, more reflective introspection. 


In fact our hot-and-cold example is relevant here, as so much of our world remains vaguely introspective, because sum-total results matter much more in theology than specific parts (a person’s ultimate salvation and moral outlook, for example, matter more than individual instances of hardship and personal aberration).  Similarly, at the level of individual molecules temperature does not exist; it is only in collections of molecules that temperature becomes apparent - the random submicroscopic motions of particles comprise the kinetic energy of a substance.  It is the translational motion of the fundamental particles of nature that gives a substance its temperature, and we identify the thermodynamic temperature of any bulk quantity of matter by measuring the average kinetic energy of its constituent particles - these translational motions being particles moving, colliding and exchanging energy in three-dimensional space – and without this we would have no concept of hot and cold.  This analysis of how our concept of hot and cold works has huge practical relevance to our assessment of miracles in nature: individual events may be instantiated in nature as single molecules are in the analogy, while the ‘temperature’ may well be what nature is as a whole.  Returning, therefore, to my coin analogy and our admission of epistemological limitations regarding randomness: nature may not permit us to know exactly which coins are the high-value coins - she may be too ‘queer’ for such valuations to take place. 
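The claim that temperature exists only as an average over many molecules can be made concrete: for a monatomic ideal gas the kinetic temperature is T = 2⟨KE⟩/(3·k_B).  A sketch with simulated per-molecule energies (the spread of energies is hypothetical, seeded for repeatability):

```python
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature(kinetic_energies):
    """Kinetic temperature of a monatomic ideal gas: T = 2<KE> / (3 k_B)."""
    mean_ke = sum(kinetic_energies) / len(kinetic_energies)
    return 2 * mean_ke / (3 * K_B)

# Simulated per-molecule kinetic energies, scattered uniformly around the
# mean that corresponds to ~293 K (room temperature): <KE> = (3/2) k_B T.
random.seed(1)
mean_ke_293 = 1.5 * K_B * 293
kes = [random.uniform(0, 2 * mean_ke_293) for _ in range(100_000)]

print(round(temperature(kes)))  # close to 293
```

No single entry in the list has a temperature; only the average over the whole collection does - which is precisely the whole-versus-parts analogy being drawn.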


Proof and miracles

Although St Paul made it clear that God’s invisible qualities are clearly seen in nature and in how human beings are affected by Him, I would agree with Kant that science and theology are mutually beneficial when kept apart from each other, rather like electricity and water.  It is the atheists who are most strident in insisting that theology be kept away from science, but ironically I think the Christian has more to gain from their separation, and ought to try just as hard to see that both subjects retain their qualitative values by not encroaching on one another’s magisterium.  Nature is a mystery that has been well worth exploring, and continues to be so, but God and the miracles He provides are probably best thought about in relation to the ‘story’ being told through nature and one’s own life, rather than their significance at a mechanical level.  Scientific analysis may well set us searching for something we can never find in the physical regime, taking our eyes off the real significance of the greater narrative.  This shouldn’t be all that surprising really – after all, the mind is quite capable of conceiving many things well beyond the headlights of science – truths that are never reified but remain abstract conceptualisations of the reality we perceive; examples include ‘society’, or the spirit or atmosphere of the local pub or the marketplace - we don’t really prove them, but we can identify with their conceptualisation.  The most we could do in an attempt to prove them is check brain states and run in our minds a logic trail of the historical memories and conceptions that make up the constituent antecedents of these ‘concepts’ – but the trail would not get very far before reaching a hiatus. 


If God’s influence runs through the historical narrative of nature, and creation ex nihilo cannot occur without His blueprint design, then instead of science forever being at odds with ‘god of the gaps’ contentions, we ought to concede that God’s sustaining hand is ever present - as an engineer in nature, but also as narrator of the cosmic story.  It is also worth pointing out that although many sceptics demand proof of the miraculous, they often have no clear concept of what proof is or how they would expect it to arrive.  The same is true of words like ‘intervention’: if they have not clearly defined what intervention is, or how intervention might occur in an up-and-running cosmos, how can they be so sure that God doesn't intervene?  The question atheists need to address much better than they have thus far is whether they have a clear conception of what proofs and miracles actually are - and whether, after all, ‘intervention’ is a term too anthropomorphised: to intervene in something God has already laid out may be only a loose metaphor for what is actually happening. 


My coin illustration, which has been running through this series, attempts to show that the mathematical patterns we see around us, and the randomness of quantum mechanics, ought to be seen as heuristics with no necessary obligation always to conform to the (apparent) brute necessities of Newtonian mechanics.  These brute necessities are universally imputed because they appear to be a consequence of the constraints on the physical laws, closing the system enough that degrees of order and uniformity are maintained.  At speeds that are slow compared with the speed of light we make use of Newtonian laws, as they are an approximation to reality that fits our macroscopic worldview. 


By pushing Newton’s laws of motion to extreme ends we do not disprove them; rather, we show their true limits and how they accord with our most intuitive apprehensions of reality.  Aspects of reality that appear to violate Newtonian mechanics or Euclidean geometry do not necessarily require the displacement of those laws; instead we modify our understanding and incorporate fresh principles to explain the anomalies, as we did with quantum theory when classical mechanics failed to explain subatomic interactions.
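The point about limits rather than disproof can be seen numerically: relativistic kinetic energy, (γ − 1)mc², reduces to the Newtonian ½mv² at everyday speeds (a small comparison sketch):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def newtonian_ke(m, v):
    """Classical kinetic energy, (1/2) m v^2."""
    return 0.5 * m * v ** 2

def relativistic_ke(m, v):
    """Relativistic kinetic energy, (gamma - 1) m c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return (gamma - 1.0) * m * C ** 2

m = 1.0  # kg
for v in (3.0e4, 0.5 * C):  # Earth's orbital speed vs half light speed
    ratio = relativistic_ke(m, v) / newtonian_ke(m, v)
    print(f"v = {v:.3g} m/s -> relativistic/Newtonian = {ratio:.6f}")
# At 30 km/s the two agree to six decimal places; at 0.5c they differ by ~24%.
```

At everyday speeds the correction is buried far below measurement error, which is why Newton’s laws survived two centuries of testing; only when pushed towards light speed do their limits show.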


Here is a fascinating example: scientists hitting a drop of iodine cyanide in water with pulses from an ultraviolet laser excited its molecular structure into an incredibly fast spin, heating it to vast temperatures and creating a virtually frictionless zone around it - an activity said to violate Newton's third law of motion, since for a very brief moment there was no equal and opposite reaction to an action.  How far this counts as a violation of Newton's third law is debatable, given that the destruction of the friction in the surrounding liquid, and the reshaping of its environment, lasted only ten trillionths of a second before the water molecules rushed back in.  Certainly linear response theory says this shouldn't happen, but non-equilibrium steady states (those that take us away from equilibrium) involve differentiable dynamical systems under microscopic time evolution, with varying results under varying perturbations of the dynamics.  The violation occurs when non-equilibrium steady states do not depend differentiably on parameters (and although this was shown with the vast spin and temperature of the iodine cyanide molecule, in many other cases, with differing properties, this non-differentiability may be hard to explore experimentally).  But the cardinal point is that nature is endowed with surprises, and potentially many aspects of her may be open-ended enough to throw up violations of laws, even laws as established as the laws of motion. 


As we increase our knowledge of the properties, liquids, gases, particles and so on that we are dealing with, we will probably have to modify our theories further, just as we did when it was once suggested that bumblebees were aerodynamically unsound and shouldn't be able to fly, their body and wing structure being so disproportionate.  Ironically, if the wings of the bumblebee were uniformly rigid (as first thought) then it would appear to violate the laws of physics; but it was soon discovered that its wings are much more flexible, so the error lay in our needing to modify our understanding - not of the laws themselves, but of how the bumblebee's wings work and the mass of vortices that move them against the main current of the air.


To summarise a message such as this: the answer to the original question is yes - nature, and our perceptions and conceptions of her, do cross modes and domains.  She is full of surprises, and given our lack of knowledge about her full quintessence, she is going to throw up plenty more in the future.  And for the man who is open enough, perhaps even wise enough, to combine his mental resources with an honest determination to see whether there is a Mind behind this universe, there is good news - he will successfully find the truth behind every miracle, and it is with this exciting finale that I will conclude the series next week.



The views carried here are those of the author, not of Network Norwich and Norfolk, and are intended to stimulate constructive debate between website users. We welcome your thoughts and comments, posted below, upon the ideas expressed here. You can also contact the author direct at james.knight@norfolk.gov.uk  

James is a Norwich local government officer, author and Proclaimers church member in Norwich.
You can access his current collections of columns here

Meanwhile, if you want to find out more about Christianity, visit: www.rejesus.co.uk

