This study explores the use of automated linguistic features in English as a Foreign Language (EFL) argumentative writing. Coh-Metrix, an advanced computational text-analysis tool, was used to analyze 87 linguistic properties of argumentative essays written by thirty-six advanced EFL students, to track their development over a semester-long writing course, and to examine the degree to which high- and low-quality essays can be predicted by objective linguistic measures of cohesion, syntactic complexity, and lexical diversity. The multidimensional analyses demonstrated that EFL learners showed significant growth in the three macro-linguistic features as a function of time spent in the advanced writing class. Such growth, however, did not match human holistic judgments of writing quality, and the features that showed growth were not successful at predicting group membership in the low- and high-quality essay categories. Using regression analysis, this study found that three sets of automated measures (connectives, syntactic pattern density, and word information) were strong predictors of holistic ratings. The predictors used in each successful model support the idea that essays containing complex syntactic structures (e.g., adverbial complexity) and diverse vocabulary (e.g., less concrete and less meaningful words) are generally perceived to be of high quality. Furthermore, the results of this study lend weight to the idea that coherent texts contain more explicit cohesive features (e.g., connective devices), at least in EFL written texts. These findings hold important implications for writing pedagogy, writing development, and writing assessment in EFL contexts.

Keywords: Coh-Metrix; Multidimensional Text Analysis; Linguistic Features; Cohesion; Syntactic Complexity; Lexical Diversity; Writing Quality; Computational Linguistics