Dynamic programming is an approach to solving problems that exhibit optimal substructure and overlapping sub-problems.
The divide and conquer approach is related to, but distinct from, dynamic programming: divide and conquer applies only to problems whose sub-problems do not overlap.
All problems that are solvable using a dynamic programming approach have these two attributes.
A problem has optimal substructure when an optimal solution to the whole problem can be constructed from optimal solutions to its sub-problems.
A problem has overlapping sub-problems when it requires the same sub-problems to be solved multiple times. This property can be exploited to reduce time complexity: remember the solution to each sub-problem instead of re-calculating it every time it is needed.
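To see the overlap concretely, here is a minimal sketch (the `fib` function and `calls` counter are illustrative, not from the text) that counts how often a naive recursive Fibonacci solves each sub-problem:

```python
from collections import Counter

calls = Counter()  # how many times each sub-problem is solved

def fib(n):
    calls[n] += 1
    if n < 2:
        return n
    # fib(n - 1) and fib(n - 2) share most of their sub-problems,
    # so the same inputs are solved over and over.
    return fib(n - 1) + fib(n - 2)

result = fib(10)
```

Running `fib(10)` solves the sub-problem `fib(2)` 34 separate times; a dynamic programming approach would solve it once.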
Tabulation refers to a method of solving sub-problems in increasing order of size: solve the simplest ones first, then use those solutions to compose solutions to more complex problems.
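As a sketch of tabulation (reusing the Fibonacci example; the name `fib_tab` is illustrative), a bottom-up version fills a pre-allocated table from the simplest cases upward:

```python
def fib_tab(n):
    if n < 2:
        return n
    table = [0] * (n + 1)  # pre-allocated; no recursion needed
    table[1] = 1
    for i in range(2, n + 1):
        # Each entry is composed from two simpler, already-solved entries.
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```

Because the loop visits each sub-problem exactly once, this runs in linear time rather than the exponential time of the naive recursion.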
Remembering the solution to a problem given a distinct set of inputs is called memoization. It's functionally similar to caching.
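A hand-rolled memoized version (the `memo` dictionary and `fib_memo` name are illustrative) makes the "remembering" explicit:

```python
memo = {}  # maps an input to its remembered solution

def fib_memo(n):
    # Return the cached answer if this input has been seen before.
    if n in memo:
        return memo[n]
    result = n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)
    memo[n] = result
    return result
```

In Python, `functools.lru_cache` applies exactly this caching automatically as a decorator.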
Which technique to use generally depends on the answer to one question: do you need to compute solutions to all sub-problems?
If not: Memoization allows solutions to be computed lazily, solving only the sub-problems the recursion actually reaches. So, when a problem does not require all sub-problems to be solved, memoization generally has lower time complexity.
But if it does: Tabulation doesn't require recursion (and therefore doesn't impose the overhead of the call stack) and allows memory to be pre-allocated (therefore using it more efficiently).
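The laziness advantage can be sketched with a problem whose recursion only touches some sub-problems, such as a binomial coefficient via Pascal's rule (the `choose` function and `cache` here are illustrative assumptions, not from the text):

```python
cache = {}  # filled lazily, only for sub-problems actually reached

def choose(n, k):
    # C(n, k) = C(n - 1, k - 1) + C(n - 1, k), memoized on (n, k).
    if k == 0 or k == n:
        return 1
    if (n, k) not in cache:
        cache[(n, k)] = choose(n - 1, k - 1) + choose(n - 1, k)
    return cache[(n, k)]

result = choose(30, 2)
```

Computing `choose(30, 2)` this way fills far fewer entries than the roughly 900 a fully tabulated 30-row Pascal's triangle would pre-allocate and fill.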