In general, algorithms for solving large-scale linear programming (LP) problems are not efficient, so a great deal of research has been devoted to developing specialized algorithms for this class of problems. Using different approaches, researchers have introduced a variety of algorithms, each with its own characteristics. There are three main approaches to solving linear programs: (1) searching among basic feasible solutions, (2) interior point methods, and (3) exterior penalty functions. In this thesis, building on previous research, an exterior penalty function is developed that recasts the LP as an unconstrained optimization problem. This penalty function is piecewise quadratic and convex, its minimization is equivalent to the primal problem, and solving it therefore yields a solution of the primal.

In previous work, a fast Newton algorithm was used to minimize this function. The Newton method can solve LP problems in which the number of constraints is much larger than the number of variables; in other words, it handles problems with a very large number of constraints and a moderate number of variables, and it produces the least 2-norm solution of the linear program. To improve on the efficiency of the Newton method, we use a new accelerated conjugate gradient method with a finite-difference Hessian/vector approximation and a fixed step size for solving large-scale LP problems. The search direction of the new algorithm approximates the Newton direction, and the algorithm does not need to compute a step size at each iteration; in other words, it avoids the Wolfe conditions for step-size determination, which are computationally very expensive.

To compare the proposed algorithm with other methods, over 1,000 random LP problems with known optimal points and optimal objective values were generated. The efficiency of the algorithm is compared with several other solvers, including the Newton method and CPLEX-Dual. To draw comprehensive conclusions about the efficiency of the proposed algorithm, Dolan and Moré performance profiles are used. It is shown that the new algorithm can also solve LP problems in which the numbers of variables and constraints are close to each other. For large-scale linear problems with different structures, the proposed algorithm is shown to be more efficient than the Newton algorithm and CPLEX-Dual. On moderate-scale problems its performance is close to that of CPLEX, whereas the Newton method fails on many of these problems. On large-scale problems the proposed algorithm is very efficient: more than 90 percent of the large-scale LP problems were solved faster than with the other algorithms, and the difference in running time between the proposed algorithm and CPLEX-Dual is substantial.

In conclusion, a new conjugate gradient method is proposed that uses a fixed step size in all iterations on all problems, and the acceleration scheme proves to be very effective in the proposed algorithm.
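
To make the penalty construction concrete, the following is a minimal sketch of one standard exterior penalty of this type, in the spirit of Mangasarian-style formulations; the exact problem format, penalty, and notation used in the thesis may differ. For the LP $\min_x c^{\top}x$ subject to $Ax \ge b$, with $A \in \mathbb{R}^{m \times n}$ and $m \gg n$, the quadratic exterior penalty with parameter $\varepsilon > 0$ is

\[
f_{\varepsilon}(x) \;=\; \varepsilon\, c^{\top}x \;+\; \tfrac{1}{2}\,\bigl\| (b - Ax)_{+} \bigr\|_{2}^{2},
\qquad (z)_{+} := \max(z, 0) \ \text{componentwise}.
\]

This is an unconstrained, piecewise quadratic, convex function of $x$ with gradient $\nabla f_{\varepsilon}(x) = \varepsilon c - A^{\top}(b - Ax)_{+}$ and generalized Hessian $A^{\top} D(x) A$, where $D(x)$ is diagonal with ones in the rows where $b - Ax > 0$. Because this generalized Hessian is only $n \times n$, Newton-type iterations remain cheap even when the number $m$ of constraints is very large, and in exactness results of this kind the multiplier estimate $(b - Ax_{\varepsilon})_{+}/\varepsilon$ is, for all sufficiently small $\varepsilon$, the least 2-norm solution of the dual LP; applying the same construction to the other problem of the dual pair yields the least 2-norm solution of the original LP referred to above.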
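
As a further illustration, the sketch below shows how an accelerated nonlinear conjugate gradient iteration with a fixed base step and a finite-difference Hessian/vector product might be organized for a penalty of the form above. The objective, the Daniel-type conjugacy parameter, the acceleration rule (rescaling the fixed step using one extra gradient evaluation), and all tolerances are illustrative placeholders of my own choosing, not the update rules of the thesis' algorithm.

import numpy as np

def penalty_value_grad(x, A, b, c, eps):
    # Illustrative exterior penalty f(x) = eps*c'x + 0.5*||(b - A x)_+||^2
    # for the LP  min c'x  s.t.  A x >= b, and its gradient.
    r = np.maximum(b - A @ x, 0.0)              # violations of A x >= b
    f = eps * (c @ x) + 0.5 * (r @ r)
    g = eps * c - A.T @ r
    return f, g

def accelerated_cg(A, b, c, eps=1e-3, alpha=1e-3, h=1e-8, tol=1e-6, max_iter=5000):
    # Sketch: nonlinear CG with a fixed base step `alpha`, a gradient-difference
    # acceleration factor, and a finite-difference Hessian/vector product used
    # in a Daniel-type conjugacy parameter.  Placeholder rules, not the thesis'.
    x = np.zeros(A.shape[1])
    _, g = penalty_value_grad(x, A, b, c, eps)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        if g @ d >= 0.0:                        # safeguard: restart on a non-descent direction
            d = -g
        # Acceleration: evaluate the gradient at the fixed-step trial point and
        # rescale the step to the minimizer of the 1-D quadratic model along d.
        _, gz = penalty_value_grad(x + alpha * d, A, b, c, eps)
        curv = (gz - g) @ d                     # ~ alpha * d'Hd, curvature along d
        step = -(g @ d) * alpha / curv if curv > 0.0 else alpha
        x = x + step * d
        _, g = penalty_value_grad(x, A, b, c, eps)
        # Finite-difference Hessian/vector product:  H d ~ (grad(x + h d) - grad(x)) / h
        _, gp = penalty_value_grad(x + h * d, A, b, c, eps)
        Hd = (gp - g) / h
        dHd = d @ Hd
        beta = (g @ Hd) / dHd if dHd > 0.0 else 0.0   # Daniel-type CG parameter
        d = -g + beta * d
    return x

The acceleration step amounts to minimizing a one-dimensional quadratic model along the current direction, so no Wolfe-type line search is required. A quick, purely illustrative way to exercise the sketch is to build a random bounded LP, for example A = rng.standard_normal((2000, 50)), b = A @ x0 - rng.random(2000) for some point x0, and c = A.T @ rng.random(2000), and then call accelerated_cg(A, b, c).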