ABSTRACT

Second language testing and assessment is not only a common activity in English as a foreign language (EFL) instruction but also a professional discipline that draws considerable attention from scholars and researchers. This chapter reports partial findings from an empirical study of assessing EFL speaking skills conducted between late 2015 and early 2016 at three universities in Southern Vietnam. The results highlight methodological diversity in test administration, speaking performance tasks, and raters' scoring practices. Candidates' opinions reveal that they need to be well informed about the assessment criteria before the test and to receive raters' feedback after it. The findings suggest that more clearly specified assessment criteria and descriptors in the rating scales yield greater consistency in measuring spoken English abilities.