Commit 9ec40a6

Update README.md
1 parent c3d6a8e commit 9ec40a6

1 file changed: +8 -8 lines changed

README.md

Lines changed: 8 additions & 8 deletions
@@ -4,21 +4,21 @@ Dynamic programming is a computer programming method used to avoid computing
 the same subproblem multiple times in a recursive algorithm.

 Dynamic programming is applied to optimization problems where there can be many possible solutions.
-Each solution has a value what can be find by solution with the optimal (minimum or maximum) value.
-This solution call an optimal solution to the problem.
+Each solution has a value, and we want to find a solution with the optimal (minimum or maximum) value.
+Such a solution is called an optimal solution to the problem.

 A dynamic programming method can be designed in 4 steps:
 1. Characterize the structure of an optimal solution.
 2. Recursively define the value of an optimal solution.
 3. Compute the value of an optimal solution in a bottom-up fashion.
 4. Construct an optimal solution from computed information.

-<b>*</b>Some of the alogirthm problem can be solved by the "divide and conquer" strategy instead of Dynamic programming.
+<b>*</b>Some algorithmic problems can be solved by the "divide and conquer" strategy instead of dynamic programming.
 For example:
 - Merge sort
 - Quick sort

-Solutions for this algorithms not overlapping sub-problems, so they not classified as dynamic programming problems.
+Solutions to these algorithms have no overlapping sub-problems, so they are not classified as dynamic programming problems.

 ### The Fibonacci

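The four design steps above can be sketched with the Fibonacci numbers used later in this README. This is a minimal illustration with an assumed function name, following the text's convention that F(0) = F(1) = 1; it is not code from the repository.

```javascript
// Step 1: an optimal "solution" for F(n) is built from smaller subproblems.
// Step 2: recursive definition: F(n) = F(n - 1) + F(n - 2), F(0) = F(1) = 1.
// Step 3: compute the values bottom-up, filling a table from the base cases.
// Step 4: read the final answer from the computed table.
function fibonacciBottomUp(n) {
  const table = [1, 1]; // base cases F(0) and F(1), per the text's convention
  for (let i = 2; i <= n; i += 1) {
    table[i] = table[i - 1] + table[i - 2];
  }
  return table[n];
}
```

Storing the whole table makes step 4 trivial here; later sections note that only the last two entries are actually needed.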
@@ -93,7 +93,7 @@ T(n) = Ω(2^(n/2)) => T(n) = O(2^n)

 <b>O(2^n)</b> - exponential time, and O(n) space complexity for the call stack.

-To describe <b>O(2^n)</b> time complexity, lets draw the recursion tree of calls,
+To describe the <b>O(2^n)</b> time complexity, let's draw the recursion tree of calls,
 which will have depth n, and intuitively figure out that this function
 is asymptotically <b>O(2^n)</b>.

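A naive recursive implementation matching this analysis might look as follows. The name getFibonacciNumberRecursive appears later in this README; the body here is a sketch, assumed rather than taken from the repository.

```javascript
// Each call spawns two further calls, so the call tree has depth n
// and O(2^n) nodes: exponential time, O(n) stack space.
function getFibonacciNumberRecursive(n) {
  if (n <= 1) return 1; // leaves F(0) and F(1) return 1
  return getFibonacciNumberRecursive(n - 1) + getFibonacciNumberRecursive(n - 2);
}
```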
@@ -117,7 +117,7 @@ To prove this conjecture by induction, let's show the recursion tree for F(5)
 ```

 In the recursion tree of calls above, the getFibonacciNumberRecursive(5) or F(5) function makes
-multiple execution with same arguments:
+multiple executions with the same arguments:
 - F(2) - 4 times
 - F(3) - 2 times
 - The leaves of the recursion tree will always return 1 (F(1) and F(0))
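Caching results removes these repeated calls, so each F(k) is computed only once. A minimal memoized sketch (the function name and cache parameter are assumptions, not the repository's code):

```javascript
// Memoization: cache each computed F(k) so repeated calls with the
// same argument return the stored value instead of recursing again.
function getFibonacciNumberMemoized(n, cache = new Map()) {
  if (n <= 1) return 1; // leaves F(0) and F(1) return 1
  if (!cache.has(n)) {
    cache.set(n, getFibonacciNumberMemoized(n - 1, cache)
               + getFibonacciNumberMemoized(n - 2, cache));
  }
  return cache.get(n);
}
```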
@@ -132,7 +132,7 @@ T(n) = O(2 ^ (n-1)) + O(2 ^ (n-2)) + O(1) = O(2^n)
 ```

 Consequently, the tight bound for this function is the Fibonacci sequence itself (~θ(1.6^n)),
-whitch related to Golden ratio
+which is related to the golden ratio.

 ![Visualization of the golden ratio with Fibonacci numbers](https://en.wikipedia.org/wiki/Fibonacci_number#/media/File:FibonacciSpiral.svg)

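The connection to the golden ratio can be checked numerically: the ratio of consecutive Fibonacci numbers converges to φ = (1 + √5) / 2 ≈ 1.618, which is where the ~θ(1.6^n) tight bound comes from. A small sketch with an assumed helper name:

```javascript
// Returns F(n) / F(n - 1), which approaches the golden ratio as n grows.
function fibonacciRatio(n) {
  let prev = 1; // F(0)
  let curr = 1; // F(1)
  for (let i = 2; i <= n; i += 1) {
    [prev, curr] = [curr, prev + curr];
  }
  return curr / prev;
}
```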
@@ -179,7 +179,7 @@ one execution with the same arguments:
 - F(2) - 1 times
 - F(3) - 1 times

-For optimization memoization method time complexity, we can storing the previous two numbers only
+To optimize the memoization method's space usage, we can store only the previous two numbers,
 because that is all we need to get the next Fibonacci number in the series.

 JavaScript iterative implementation

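A two-variable iterative version along those lines might look like this; it is a sketch under the same F(0) = F(1) = 1 convention, not necessarily the repository's actual implementation.

```javascript
// O(n) time and O(1) space: only the previous two numbers are kept.
function getFibonacciNumberIterative(n) {
  let prev = 1; // F(0)
  let curr = 1; // F(1)
  for (let i = 2; i <= n; i += 1) {
    const next = prev + curr;
    prev = curr;
    curr = next;
  }
  return curr;
}
```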