MT Evaluation


Neural Machine Translation has grown tremendously over the last decade, and Machine Translation Post-Editing (MTPE) is often a good choice for faster time-to-market and reliable quality at a more affordable price.


However, to perform effectively, NMT needs careful evaluation and training.

Evaluating different MT outputs entails measuring and judging machine translation quality before an actual project begins. The evaluation is performed by qualified native speakers, who assess text that has previously been translated by a machine.


One option is to evaluate and compare different outputs and choose the one that performs best for the text's field of specialization.


By evaluating MT outputs, you can ensure that the MT solution you implement will be the most reliable and the fastest to work with.


Furthermore, by analyzing your output, you can identify which aspects of the output are performing best and which would need more corrections by translators. You can discover recurring patterns of errors in the MT output and ask translators to be particularly careful when dealing with them, thus achieving better quality. You might even discover that the output does not need any editing, because it is already comprehensible and meets the quality standards required for your translation project. Or you could realize that post-editing is not what a specific project needs, and that editing the output as required would ultimately be too time-consuming and frustrating.


Evaluating MT quality is very useful to developers and researchers of MT technology as well, because it helps them gauge how well their systems actually perform.


Creative Words can help you identify the MT engine that best fits your goals and provide quality translations to ensure you achieve them, on time and within budget.


Specifically, our qualified linguists can:

  • Compare different MT outputs and assign a score to each result individually to analyze which one performs best and why
  • Post-edit different MT outputs and track which solution is the most time-saving
  • Track effort savings, meaning quantify how much time will be saved on average by post-editing a specific output, instead of translating source text from scratch
  • Identify each output's weak spots, in order to compile a list of best practices to attach to the post-editing project, making the job easier for post-editors
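To give a sense of how scoring an output might work in practice, here is a minimal sketch of one common proxy for post-editing effort: a word-level edit rate, where the number of edits needed to turn the raw MT output into its post-edited version is divided by the length of the post-edited text. The function names and the example sentences are illustrative, not part of any specific tool; a lower score means less post-editing effort.

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two token lists."""
    n = len(b)
    dp = list(range(n + 1))  # distances for the empty prefix of `a`
    for i in range(1, len(a) + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            # deletion, insertion, or substitution (free if tokens match)
            dp[j] = min(dp[j] + 1, dp[j - 1] + 1, prev + (a[i - 1] != b[j - 1]))
            prev = cur
    return dp[n]

def edit_rate(mt_output, post_edited):
    """Word-level edits divided by the post-edited length (a rough TER-style proxy)."""
    hyp, ref = mt_output.split(), post_edited.split()
    return levenshtein(hyp, ref) / max(len(ref), 1)

# One substituted word out of four: 25% of the text needed editing.
print(edit_rate("the contract was signed", "the agreement was signed"))  # → 0.25
```

Comparing average edit rates across engines on the same sample gives a first, quantitative indication of which output would be cheapest to post-edit, before human reviewers weigh in on error types and severity.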


We make sure different linguists perform the assessment in order to make results more reliable and to avoid human bias.


If you are looking to find the translation solution that will best serve your project, according to your quality and budget expectations, we’re here to help.


Creative Words, translation services, Genova