
Apeejay Institute of Management & Engineering

Technical Campus

QUANTITATIVE TECHNIQUES
ASSIGNMENT 2
SUBJECT CODE- MBA 103-18

TOPIC- What is linear programming, and how is it helpful for decision making in Product Management?

SUBMITTED BY: Sonu Kumar, MBA-1A, 221139
SUBMITTED TO: Ms. PARAMJIT KAUR
[BATCH 2022-24]
Linear programming
Linear programming is a mathematical modeling technique in which a linear function is maximized or minimized subject to various constraints. This technique has been useful for guiding quantitative decisions in business planning, in industrial engineering, and, to a lesser extent, in the social and physical sciences.

The solution of a linear programming problem reduces to finding the optimum value (largest or smallest, depending on the problem) of the linear expression (called the objective function)

f = c1x1 + c2x2 + … + cnxn

subject to a set of constraints expressed as inequalities:

a11x1 + a12x2 + … + a1nxn ≤ b1
a21x1 + a22x2 + … + a2nxn ≤ b2
…
am1x1 + am2x2 + … + amnxn ≤ bm

The a’s, b’s, and c’s are constants determined by the capacities, needs, costs, profits,
and other requirements and restrictions of the problem. The basic assumption in the
application of this method is that the various relationships between demand and
availability are linear; that is, none of the xi is raised to a power other than 1. In order to
obtain the solution to this problem, it is necessary to find the solution of the system of
linear inequalities (that is, the set of n values of the variables xi that simultaneously
satisfies all the inequalities). The objective function is then evaluated by substituting the
values of the xi in the equation that defines f.

Applications of the method of linear programming were first seriously attempted in the
late 1930s by the Soviet mathematician Leonid Kantorovich and by the American
economist Wassily Leontief in the areas of manufacturing schedules and
of economics, respectively, but their work was ignored for decades.

During World War II, linear programming was used extensively to deal with
transportation, scheduling, and allocation of resources subject to certain
restrictions such as costs and availability. These applications did much to establish
the acceptability of this method, which gained further impetus in 1947 with the
introduction of the American mathematician George Dantzig’s simplex method, which
greatly simplified the solution of linear programming problems.

However, as increasingly more complex problems involving more variables were attempted, the number of necessary operations expanded exponentially and exceeded
the computational capacity of even the most powerful computers. Then, in 1979, the
Russian mathematician Leonid Khachiyan discovered a polynomial-time algorithm
—in which the number of computational steps grows as a power of the number of
variables rather than exponentially—thereby allowing the solution of hitherto
inaccessible problems. However, Khachiyan’s algorithm (called the ellipsoid method)
was slower than the simplex method when practically applied. In 1984 Indian
mathematician Narendra Karmarkar discovered another polynomial-time
algorithm, the interior point method, that proved competitive with the simplex
method.

Linear Programming Examples


Suppose a postman has to deliver 6 letters in a day from the post office (located at A) to different houses (U, V, W, Y, Z), where the distance between the houses is indicated on the lines of the accompanying route diagram. If the postman wants to find the shortest route that will enable him to deliver the letters as well as save on fuel, then it becomes a linear programming problem. Thus, LP will be used to get the optimal solution, which in this example is the shortest route.

Linear Programming Formula


A linear programming problem consists of decision variables, an objective function, constraints, and non-negative restrictions. The decision variables, x and y, determine the output of the LP problem and represent the final solution. The objective function, Z, is the linear function that needs to be optimized (maximized or minimized) to obtain the solution. The constraints are the restrictions imposed on the decision variables to limit their values. The decision variables must always take non-negative values, which is ensured by the non-negative restrictions. The general formula of a linear programming problem is given below:

Objective Function: Z = ax + by

Constraints: cx + dy ≤ e, fx + gy ≤ h. The inequalities can also be "≥"

Non-negative restrictions: x ≥ 0, y ≥ 0
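
To make these components concrete, here is a minimal sketch in Python of the same general two-variable problem. The numeric values chosen for a, b, c, d, e, f, g and h below are hypothetical placeholders, not values taken from this assignment.

    # Hypothetical placeholder coefficients for the general form above
    a, b = 3.0, 4.0           # objective:     Z = ax + by
    c, d, e = 1.0, 2.0, 14.0  # constraint 1:  cx + dy <= e
    f, g, h = 3.0, 1.0, 15.0  # constraint 2:  fx + gy <= h

    def objective(x, y):
        """Value of the objective function Z for a given choice of decision variables."""
        return a * x + b * y

    def is_feasible(x, y):
        """Non-negative restrictions plus both '<=' constraints."""
        return x >= 0 and y >= 0 and c * x + d * y <= e and f * x + g * y <= h

    # A point is a candidate solution only if it is feasible; the LP seeks the
    # feasible point with the best (largest or smallest) objective value.
    print(is_feasible(2, 5), objective(2, 5))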

How to Solve Linear Programming Problems?


The most important part of solving a linear programming problem is first formulating it using the given data. The steps to solve a linear programming problem are given below:

Step 1: Identify the decision variables.
Step 2: Formulate the objective function and check whether it needs to be minimized or maximized.
Step 3: Write down the constraints.
Step 4: Ensure that the decision variables are greater than or equal to 0 (the non-negative restriction).
Step 5: Solve the linear programming problem using either the simplex method or the graphical method.

Let us study these methods in detail in the following sections.

Linear Programming Methods


There are two main methods for solving a linear programming problem: the simplex method and the graphical method. Given below are the steps to solve a linear programming problem using both methods.

Linear Programming by Simplex Method


The simplex method in LPP can be applied to problems with two or more decision
variables. Suppose the objective function Z = 40x1 + 30x2 needs to be maximized and
the constraints are given as follows:
x1 + x2 ≤ 12
2x1 + x2 ≤ 16
x1 ≥ 0, x2 ≥ 0
Step 1: Add slack variables to convert the inequalities into equations, and rewrite the objective function as an equation.
- 40x1 - 30x2 + Z = 0
x1 + x2 + y1 =12
2x1 + x2 + y2 =16
y1 and y2 are the slack variables.
Step 2: Construct the initial simplex tableau from the equations in Step 1 (columns x1, x2, y1, y2, Z and the right-hand side):

 x1   x2   y1   y2   Z | RHS
  1    1    1    0   0 |  12
  2    1    0    1   0 |  16
-40  -30    0    0   1 |   0

Step 3: Identify the column with the most negative entry in the bottom row. This is called the pivot column. Since -40 is the most negative entry, column 1 is the pivot column.

Step 4: Divide the entries in the rightmost column by the entries in the pivot column. We
exclude the entries in the bottom-most row.

12 / 1 = 12

16 / 2 = 8

The row with the smallest quotient is the pivot row. Since 8 is the smaller quotient, row 2 becomes the pivot row. The intersection of the pivot row and the pivot column gives the pivot element.

Thus, pivot element = 2.

Step 5: Using the pivot element, perform pivoting with elementary row operations so that all other entries in the pivot column become 0. First divide row 2 by the pivot element (R2 → R2/2), then eliminate the pivot column from the other rows (R1 → R1 - R2 and R3 → R3 + 40R2).
Step 6: Check whether the bottom-most row has any negative entries. If not, the optimal solution has been determined. If it does, go back to Step 3 and repeat the process. Here -10 is still a negative entry, so the process is repeated by pivoting on the x2 column, which gives the following tableau:

 x1   x2   y1   y2   Z | RHS
  0    1    2   -1   0 |   8
  1    0   -1    1   0 |   4
  0    0   20   10   1 | 400

Writing the bottom row in the form of an equation, we get Z = 400 - 20y1 - 10y2. Thus, 400 is the highest value that Z can achieve, attained when both y1 and y2 are 0. Reading off the basic variables, x1 = 4 and x2 = 8, and indeed Z = 40(4) + 30(8) = 400. Thus, x1 = 4 and x2 = 8 are the optimal points and the solution to our linear programming problem.
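
The pivoting described in Steps 2 to 6 can also be reproduced numerically. The sketch below (assuming Python with NumPy is available) builds the initial tableau from Step 1 and applies the two pivots; it is only an illustration of these row operations for this particular example, not a general simplex implementation.

    import numpy as np

    # Initial tableau for: maximize Z = 40x1 + 30x2
    # subject to x1 + x2 + y1 = 12 and 2x1 + x2 + y2 = 16
    # Columns: x1, x2, y1, y2, Z | RHS
    T = np.array([
        [  1.0,   1.0, 1.0, 0.0, 0.0, 12.0],
        [  2.0,   1.0, 0.0, 1.0, 0.0, 16.0],
        [-40.0, -30.0, 0.0, 0.0, 1.0,  0.0],
    ])

    def pivot(T, row, col):
        """Scale the pivot row, then clear the pivot column from every other row."""
        T = T.copy()
        T[row] /= T[row, col]
        for r in range(T.shape[0]):
            if r != row:
                T[r] -= T[r, col] * T[row]
        return T

    # Iteration 1: pivot column x1 (most negative entry -40); ratios 12/1, 16/2 -> second row
    T = pivot(T, row=1, col=0)
    # Iteration 2: the bottom row still has -10 under x2; ratios 4/0.5, 8/0.5 -> first row
    T = pivot(T, row=0, col=1)

    print(T)          # bottom row reads Z = 400 - 20*y1 - 10*y2
    print(T[-1, -1])  # optimal value 400.0, attained at x1 = 4, x2 = 8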

Linear Programming by Graphical Method


If there are two decision variables in a linear programming problem then the graphical
method can be used to solve such a problem easily.

Suppose we have to maximize Z = 2x + 5y.

The constraints are x + 4y ≤ 24, 3x + y ≤ 21 and x + y ≤ 9

where, x ≥ 0 and y ≥ 0.

To solve this problem using the graphical method the steps are as follows.

Step 1: Write all inequality constraints in the form of equations.

x + 4y = 24

3x + y = 21

x+y=9

Step 2: Plot these lines on a graph by identifying test points.

x + 4y = 24 is a line passing through (0, 6) and (24, 0). [By substituting x = 0 the point
(0, 6) is obtained. Similarly, when y = 0 the point (24, 0) is determined.]

3x + y = 21 passes through (0, 21) and (7, 0).

x + y = 9 passes through (9, 0) and (0, 9).
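
The original figure for this example is not reproduced here; the following sketch (assuming Python with NumPy and Matplotlib) plots the three constraint lines and shades the region satisfying all of the constraints, which should correspond to the feasible region OABCD described in the next step.

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 10, 400)
    plt.plot(x, (24 - x) / 4, label="x + 4y = 24")
    plt.plot(x, 21 - 3 * x, label="3x + y = 21")
    plt.plot(x, 9 - x, label="x + y = 9")

    # Feasible y-values lie below all three lines and above the x-axis (y >= 0)
    y_upper = np.minimum.reduce([(24 - x) / 4, 21 - 3 * x, 9 - x])
    plt.fill_between(x, 0, np.maximum(y_upper, 0), alpha=0.3, label="feasible region")

    plt.xlim(0, 10)
    plt.ylim(0, 10)
    plt.legend()
    plt.show()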

Step 3: Identify the feasible region. The feasible region is the set of all points that simultaneously satisfy every constraint of the problem, including the non-negative restrictions.

Any point that lies on or below the line x + 4y = 24 will satisfy the constraint x + 4y ≤ 24.

Similarly, a point that lies on or below 3x + y = 21 satisfies 3x + y ≤ 21.

Also, a point lying on or below the line x + y = 9 satisfies x + y ≤ 9.

The feasible region is represented by OABCD, as it satisfies all three of the above-mentioned restrictions.

Step 4: Determine the coordinates of the corner points. The corner points are the
vertices of the feasible region.

O = (0, 0)
A = (7, 0)

B = (6, 3). B is the intersection of the two lines 3x + y = 21 and x + y = 9. Thus, by substituting y = 9 - x in 3x + y = 21 we can determine the point of intersection.

C = (4, 5) formed by the intersection of x + 4y = 24 and x + y = 9

D = (0, 6)

Step 5: Substitute each corner point in the objective function. The point that gives the
greatest (maximizing) or smallest (minimizing) value of the objective function will be the
optimal point.

Corner Point    Z = 2x + 5y
O = (0, 0)      0
A = (7, 0)      14
B = (6, 3)      27
C = (4, 5)      33
D = (0, 6)      30

33 is the maximum value of Z, and it occurs at C. Thus, the solution is x = 4 and y = 5.
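
As a quick cross-check of Steps 4 and 5, the short sketch below (assuming Python with NumPy) recomputes the corner points B and C as line intersections and evaluates Z = 2x + 5y at every corner of OABCD.

    import numpy as np

    # B is the intersection of 3x + y = 21 and x + y = 9; C of x + 4y = 24 and x + y = 9
    B = np.linalg.solve([[3, 1], [1, 1]], [21, 9])  # -> [6, 3]
    C = np.linalg.solve([[1, 4], [1, 1]], [24, 9])  # -> [4, 5]

    corners = [(0, 0), (7, 0), tuple(B), tuple(C), (0, 6)]
    for x, y in corners:
        print((x, y), 2 * x + 5 * y)
    # The largest value, 33, occurs at (4, 5), matching the result above.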

How is LPP helpful for decision making in Product Management?

Production Management:
LP is applied to determine the optimal allocation of resources such as materials, machines, and manpower by a firm. It is used to determine the optimal product mix of the firm to maximize its revenue. It is also used for production smoothing and assembly line balancing.

PRODUCT MIX PROBLEM

A factory manufactures two products A and B. To manufacture one unit of A, 1.5 machine hours and 2.5 labour hours are required. To manufacture one unit of B, 2.5 machine hours and 1.5 labour hours are required. In a month, 300 machine hours and 240 labour hours are available. Profit per unit for A is Rs. 50 and for B is Rs. 40.

Formulate as an LPP.

Solution:

There will be two constraints: one for machine-hours availability and one for labour-hours availability.

Decision variables:

x1 = Number of units of A manufactured per month.
x2 = Number of units of B manufactured per month.

The objective function:

Max Z = 50x1 + 40x2

Subject to the constraints:

For machine hours: 1.5x1 + 2.5x2 ≤ 300
For labour hours: 2.5x1 + 1.5x2 ≤ 240

Non-negativity: x1, x2 ≥ 0
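
This formulation can be handed directly to an off-the-shelf LP solver. A minimal sketch, assuming Python with SciPy's linprog, is given below; since linprog minimizes by default, the profit coefficients are negated. On this formulation the optimum works out to roughly x1 = 37.5 and x2 = 97.5, for a monthly profit of about Rs. 5,775.

    from scipy.optimize import linprog

    # Maximize Z = 50x1 + 40x2 by minimizing -Z (linprog minimizes by default)
    c = [-50, -40]
    A_ub = [[1.5, 2.5],   # machine hours:  1.5x1 + 2.5x2 <= 300
            [2.5, 1.5]]   # labour hours:   2.5x1 + 1.5x2 <= 240
    b_ub = [300, 240]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
    print(res.x, -res.fun)  # approximately [37.5, 97.5] and 5775.0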
