For the security guard, hearing a single noise is likely to be nothing. However, what if you heard two noises, and the sound of tires outside?
Same thing with the doctor. Most good doctors I know have a sixth sense about when something is off and needs further tests beyond just "take an aspirin". So maybe the person had a stomach ache, and they had lost some weight, and they were looking a little yellow. All of a sudden the probabilities start looking a lot different.
You would think Bayesian inference is good at integrating multiple information sources, but in practice you have to model the dependencies between those sources, and even doing a good job of that doesn't save you from subtleties like "explaining away". In real life people use Naive Bayes a lot, because properly modelling a Bayesian network is hard, and trying to learn the network gets you into all sorts of problems: allow arbitrary dependencies between N inputs and you are talking exponentially many parameters (on the order of 2ᴺ for binary variables), and you'll never solve it.
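To make "explaining away" concrete, here is a minimal sketch of the textbook Burglary/Earthquake/Alarm network, with made-up probabilities and brute-force enumeration rather than any library. Seeing the alarm makes a burglary fairly plausible; additionally learning there was an earthquake pushes the burglary estimate back down, because the earthquake already accounts for the alarm:

```python
# Tiny "explaining away" demo on the classic Burglary/Earthquake/Alarm network.
# All numbers are illustrative; inference is exact enumeration, no libraries.

P_B = 0.01          # P(Burglary)
P_E = 0.02          # P(Earthquake)
# P(Alarm | Burglary, Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}

def joint(b, e, a):
    """P(B=b, E=e, A=a) under the factorization P(B) * P(E) * P(A | B, E)."""
    pb = P_B if b else 1 - P_B
    pe = P_E if e else 1 - P_E
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    return pb * pe * pa

def posterior_burglary(evidence):
    """P(Burglary=True | evidence), computed by summing over all worlds."""
    num = den = 0.0
    for b in (True, False):
        for e in (True, False):
            for a in (True, False):
                world = {"B": b, "E": e, "A": a}
                if all(world[k] == v for k, v in evidence.items()):
                    p = joint(b, e, a)
                    den += p
                    if b:
                        num += p
    return num / den

print(posterior_burglary({"A": True}))              # ~0.58: alarm alone suggests burglary
print(posterior_burglary({"A": True, "E": True}))   # ~0.03: the earthquake explains it away
```

Naive Bayes, which treats each information source as conditionally independent, can't reproduce that interaction; that's the price of dropping the dependency structure.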
This is one of the reasons people got frustrated with Expert Systems: real-life reasoning requires reasoning under uncertainty, and we don't have a satisfactory general way to do it.
The whole point of Bayesian networks is to have something that's asymptotically simpler than "arbitrary dependencies between N inputs" while still being able to model useful scenarios.
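To put a rough number on "asymptotically simpler", here's a back-of-the-envelope parameter count, assuming binary variables and an illustrative cap of three parents per node (both assumptions are mine, just to show the scale):

```python
# Parameter count: full joint over N binary variables vs. a Bayesian network
# where each node has a bounded number of parents. Parent counts below are
# illustrative assumptions, not from any particular model.

def full_joint_params(n):
    # One free probability per assignment, minus 1 for normalization.
    return 2 ** n - 1

def bayes_net_params(parent_counts):
    # Each binary node needs one CPT entry per assignment of its parents.
    return sum(2 ** k for k in parent_counts)

n = 20
parents = [min(i, 3) for i in range(n)]   # assume at most 3 parents per node
print(full_joint_params(n))       # 1_048_575 free parameters
print(bayes_net_params(parents))  # 143 parameters
```

The exponential blow-up only happens in the size of the largest parent set, not in the total number of variables, which is exactly the trade-off the structure buys you.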