Probability is a numerical measurement of the likelihood of an outcome of some random process. Randomness is the effect of chance and is a fundamental property of the system, even if we cannot directly measure it. It is not reducible through either study or further measurement, but may be reduced through changing the physical system. Randomness has been described as "aleatory uncertainty" and "stochastic variability". The concept of probability can be developed neatly from two different approaches:
The frequentist approach asks us to imagine repeating the physical process an extremely large number of times (trials) and then to look at the fraction of times that the outcome of interest occurs. That fraction is asymptotically (meaning as we approach an infinite number of trials) equal to the probability of that particular outcome for that physical process. So, for example, the frequentist would imagine that we toss a coin a very large number of times. The fraction of the tosses that came up heads is approximately the true probability of a single toss producing a head, and the more tosses we do the closer the fraction becomes to the true probability. So, for a fair coin, we should see the number of heads stabilize at around 50% of the trials as the number of trials gets truly huge. The philosophical problem with this approach is that one usually does not have the opportunity to repeat the scenario a very large number of times. How do we reconcile this approach with, for example, the probability of it raining tomorrow, or of you having a car crash?
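The frequentist picture of the coin toss is easy to illustrate in code. The sketch below (an illustration, not part of the original text; the function name and parameters are our own) simulates repeated tosses and shows the observed fraction of heads settling toward the true probability as the number of trials grows:

```python
import random

def head_fraction(n_tosses, p_head=0.5, seed=1):
    """Simulate n_tosses of a coin and return the observed fraction of heads."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    heads = sum(rng.random() < p_head for _ in range(n_tosses))
    return heads / n_tosses

# The fraction of heads should drift toward p_head as trials increase.
for n in (100, 10_000, 1_000_000):
    print(n, head_fraction(n))
```

With a fair coin the printed fractions cluster ever more tightly around 0.5, which is exactly the asymptotic behaviour the frequentist definition relies on.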
The physicist or engineer, on the other hand, could look at the coin, measure it, spin it, bounce lasers off its surface, etc., until declaring that, by symmetry, the coin must logically have a 50% probability of landing on either face (for a fair coin; an unbalanced coin would take some other value, as the measurements dictated). Determining probabilities by deductive reasoning has a far broader application than the frequency approach because it does not require us to imagine being able to infinitely repeat the same physical process.
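The deductive approach amounts to counting equally likely outcomes, which a short sketch can make concrete. This example (ours, not from the source; the function name is illustrative) deduces the probability of a given total from fair dice purely by enumeration and symmetry, with no repeated trials at all:

```python
from itertools import product
from fractions import Fraction

def prob_of_sum(target, sides=6, n_dice=2):
    """Deduce P(sum == target) by counting equally likely outcomes of fair dice."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    favourable = sum(1 for o in outcomes if sum(o) == target)
    return Fraction(favourable, len(outcomes))

print(prob_of_sum(7))  # 6 favourable outcomes out of 36, i.e. 1/6
```

Because every face is assumed equally likely by symmetry, the probability follows logically from counting, just as the engineer's measurements of the coin lead to 50% without any tossing.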
A third, subjective, definition
In this context, "probability" would be our measure of how much we believe something to be true. In ModelAssist we use the term "confidence" instead of probability, to make the separation between belief and real-world probability clear. A distribution of confidence looks exactly the same as a distribution of probability and must follow the same rules of complementation, addition, etc., which makes it easy to mix up the two ideas. Uncertainty is the assessor's lack of knowledge (level of ignorance) about the parameters that characterize the physical system being modeled. It is sometimes reducible through further measurement or study. Uncertainty has also been called "fundamental uncertainty", "epistemic uncertainty", and "degree of belief".
Does probability exist?
In the 19th century a rather depressing philosophical school of thought, usually attributed to the mathematician Marquis Pierre-Simon de Laplace, became popular. It proposed that there is no such thing as chance, only uncertainty: there is no randomness in the world, and an omniscient being or machine, a "Laplace Machine", could predict any future event given perfect information about the present. This was the foundation of the physics of the day, Newtonian physics, and even Albert Einstein believed in the determinism of the physical world, as in his often-quoted remark that "God does not play dice with the Universe".
Heisenberg's Uncertainty Principle, one of the foundations of modern physics and, in particular, of quantum mechanics, shows us that this is not true at the molecular level, and therefore subtly at any greater scale. In essence, it states that the more one characteristic of a particle is constrained (for example, its location in space), the more random another characteristic becomes (if the first characteristic is location, the second would be its momentum). Einstein tried to argue that it is our knowledge of one characteristic that we lose as we gain knowledge of another, rather than either characteristic being a genuinely random variable, but he has subsequently been proven wrong both theoretically and experimentally. Quantum mechanics has so far proven itself very accurate in predicting experimental outcomes at the molecular level, where the predicted random effects are most easily observed, so we have a great deal of empirical evidence to support the theory.

Philosophically, the idea that everything is pre-determined (i.e. that the world is deterministic) is also very difficult to accept, as it deprives us humans of free will. The non-existence of free will would in turn mean that we are not responsible for our actions: we are reduced to complicated machines, and it is meaningless to be either praised or punished for our deeds and misdeeds, which of course is contrary to the principles of any civilization or religion. Thus, if one accepts the existence of free will, one must also accept an element of randomness in all things that humans affect.
Kolmogorov considered that probability could co-exist with a deterministic world when the inter-relationships between the elements of the world are so complex, and so susceptible to subtle changes, that they can never be predicted. The intriguing work of Stephen Wolfram, among others, shows that even minute changes to a highly ordered system of simple interactions can turn a repeatable pattern into seeming randomness.
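Wolfram's observation can be seen in his elementary cellular automata. The sketch below (our illustration, not from the source) implements his Rule 30, a fully deterministic rule on a row of cells that, started from a single live cell, rapidly produces a pattern with no apparent repetition:

```python
def rule30_step(row):
    """Apply Wolfram's Rule 30 to a row of 0/1 cells, padding the edges with 0."""
    padded = (0,) + tuple(row) + (0,)
    # Rule 30: new cell = left XOR (centre OR right)
    return tuple(padded[i - 1] ^ (padded[i] | padded[i + 1])
                 for i in range(1, len(padded) - 1))

# Start from a single live cell and watch order give way to apparent randomness.
width, steps = 31, 15
row = tuple(1 if i == width // 2 else 0 for i in range(width))
for _ in range(steps):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```

Every step is perfectly deterministic, yet the centre column of the pattern passes standard statistical tests for randomness, which is precisely the kind of complexity-generated unpredictability Kolmogorov had in mind.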