Assessment is an essential tool in the teaching and learning process for measuring the achievement of educational objectives. One of the more complex assessment methods is the essay test, which requires significant correction time and can be influenced by subjectivity. A system capable of automatically scoring written answers is therefore needed to help teachers grade more quickly and objectively. This thesis examines automated scoring of short answers and develops an Automated Short Answer Scoring (ASAS) application that implements sentence embedding and word embedding methods. ASAS is applied to assess students’ answers in the Computer Network Engineering subject. Experimental results show that the sentence embedding method achieved a correlation coefficient of 0.8132 with an MAE of 0.5096, whereas the word embedding method achieved a correlation coefficient of 0.67332 with an MAE of 0.9991. The correlation coefficients indicate that automatic scoring by ASAS with sentence embeddings agrees more closely with teachers’ manual scoring than the word embedding method does. ASAS can thus be an effective tool for speeding up evaluation, reducing teachers’ workload, minimising subjectivity, and improving the consistency of assessment results. The application has great potential for widespread use in educational contexts where fast and accurate assessment is crucial, and this study also contributes to improving the quality of academic evaluation.
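The abstract reports agreement between automatic and manual scores using a correlation coefficient and MAE. As a minimal sketch of how such a pipeline might work, the snippet below assumes (this detail is not stated in the abstract) that each answer is scored by the cosine similarity between its embedding vector and a reference-answer embedding, then evaluated against teachers' scores with Pearson correlation and mean absolute error. All function names and the toy vectors are hypothetical illustrations, not the thesis's actual implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (assumed measure)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def mae(predicted, manual):
    """Mean absolute error between automatic and manual scores."""
    return sum(abs(p - m) for p, m in zip(predicted, manual)) / len(predicted)

def pearson(predicted, manual):
    """Pearson correlation coefficient, the agreement measure reported."""
    n = len(predicted)
    mean_p = sum(predicted) / n
    mean_m = sum(manual) / n
    cov = sum((p - mean_p) * (m - mean_m) for p, m in zip(predicted, manual))
    std_p = math.sqrt(sum((p - mean_p) ** 2 for p in predicted))
    std_m = math.sqrt(sum((m - mean_m) ** 2 for m in manual))
    return cov / (std_p * std_m)

if __name__ == "__main__":
    # Toy vectors standing in for sentence-embedding output.
    reference = [0.8, 0.1, 0.6]   # embedding of the model answer
    student = [0.7, 0.2, 0.5]     # embedding of a student's answer
    # Scale similarity to a 0-10 score (an illustrative convention only).
    score = 10 * cosine_similarity(student, reference)
    print(round(score, 2))
```

In practice the embeddings would come from a sentence-embedding or word-embedding model rather than toy vectors, and the Pearson and MAE values would be computed over the full set of student answers against the teachers' manual scores.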