What is the difference between problems of conditional probability and problems that use Bayes' theorem?
P(A|B) = P(B|A) P(A) / P(B)
As one can notice, there are two conditional probabilities in this formula. Bayes' formula is a tool to interchange the roles of the events A and B in a conditional probability.
I suppose by “conditional probability” you meant this:
P(A|B) = P(A∩B) / P(B)
The former is usually used in Bayesian inference and in models where you are interested in the distribution only up to a normalizing factor (P(B)), while the latter is usually used simply to compute a conditional probability when the events A and B are relatively simple. Note that if you just want to compute P(A|B), applying Bayes' formula by itself is useless, since you replace P(A|B) with the equally difficult object P(B|A).
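As a quick sanity check, here is a small sketch (with a made-up fair-die example, not from the answer above) showing that the definition and Bayes' formula give the same number:

```python
# Fair six-sided die. Let A = "roll is even" = {2, 4, 6},
# and B = "roll is greater than 3" = {4, 5, 6}.
p_A = 3 / 6
p_B = 3 / 6
p_A_and_B = 2 / 6  # A ∩ B = {4, 6}

# Definition of conditional probability: P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = p_A_and_B / p_B

# Bayes' formula: P(A|B) = P(B|A) P(A) / P(B),
# where P(B|A) = P(A ∩ B) / P(A) by the same definition.
p_B_given_A = p_A_and_B / p_A
p_A_given_B_bayes = p_B_given_A * p_A / p_B

print(p_A_given_B, p_A_given_B_bayes)  # both 2/3
```

Both routes land on the same value, which illustrates the point: Bayes' formula does not make P(A|B) easier by itself, it just re-expresses it through P(B|A).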
Conditional probability is a useful concept, and Bayes' theorem is a useful tool for making calculations with conditional probabilities.
Bayes' theorem is how you flip a conditional probability. If you know P(X|Y) (the probability of X given Y), Bayes' theorem tells you how to calculate P(Y|X).
For example, let's say you want to know the probability of a car accident given that someone is drunk. This could be hard to get data for directly. However, Bayes' theorem tells you how to compute it from the probability that someone was drunk given that they were in a car accident, combined with the overall rates of drunkenness and of accidents.
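The flip described above can be made concrete. In this sketch all numbers are invented for illustration only, not real traffic statistics:

```python
# Hypothetical rates, chosen purely for illustration.
p_drunk = 0.01                 # P(drunk): fraction of drivers who are drunk
p_accident = 0.001             # P(accident): overall accident rate per driver
p_drunk_given_accident = 0.25  # P(drunk | accident): obtainable from accident reports

# Bayes' theorem flips the conditional:
# P(accident | drunk) = P(drunk | accident) * P(accident) / P(drunk)
p_accident_given_drunk = p_drunk_given_accident * p_accident / p_drunk

print(p_accident_given_drunk)  # 0.025
```

Here the hard-to-measure quantity P(accident | drunk) is recovered from P(drunk | accident), which accident reports do record, plus the two base rates.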