Localising Video Content: Why Choose Subtitles and What to Consider

Currently the two main options for localising video content are subtitling and dubbing. Which of these is the best fit in a given situation depends on a variety of factors; however, in recent times subtitles have proved the more common choice for a number of reasons. In the UK, for example, 75% of viewers prefer subtitles to dubbing when watching foreign-language content.

In this blog we’ll discuss the main reasons to choose subtitles and what to take into account when commissioning them.

Why subtitles?

In the past, subtitles (or captions, the format that displays text for all audio rather than just speech, e.g. music and sound effects) were mainly reserved for viewers who are hard of hearing or for watching foreign-language content. While these purposes are still relevant, subtitles are now popular with a much wider demographic for a number of other reasons.

1. Firstly, subtitles make it easier to watch content on the go, which is increasingly important in today’s mobile world. Research by Verizon Media and Publicis Media found that 69% of consumers view video with the sound off in public places.

Subtitles allow users to engage with content without needing earphones, which comes in handy on the train commute or during a coffee break in a busy cafe. It’s important to consider how content will be consumed when creating it. For example, Instagram Stories, short videos designed to be watchable on the go, are becoming more and more visually focused, reducing the reliance on audio.

2. In addition to short-form videos, there is also a trend for young people to watch TV shows with subtitles. According to recent research by Stagetext, young viewers are four times more likely to use them than older age groups, despite having fewer hearing problems.

Stagetext’s CEO Melanie Sharpe attributes this to subtitles being the norm for young people, who are used to taking in information more quickly than their older peers. As the volume of digital content continues to grow, we can expect new generations of viewers to follow the same pattern and opt for the written word to accompany their favourite new series.

3. Lastly, from a budget perspective subtitles are a much cheaper option than dubbing, since the subtitling workflow requires fewer stages and resources. To create subtitles, a transcriber uses specialist software to convert the spoken word into time-stamped segments that fit on screen consecutively (illustrated in the short example below). When the subtitles are required in a different language from the audio, the transcriber also needs translation expertise to turn the language A audio into language B subtitles.

With dubbing, following the transcription and translation stage, a voiceover artist produces the localised audio in a recording studio. Because of this additional stage and the resources it requires, a good dub can cost ten times as much as the equivalent subtitles.
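
To make the idea of time-stamped segments more concrete, here is a minimal sketch in Python of how translated text and its start and end times could be written out as cues in the widely used SubRip (.srt) format. The file name and example dialogue are hypothetical, and this is only an illustration of the format rather than any provider’s actual tooling.

```python
# Each segment pairs a start/end time (in seconds) with the translated text.
# The dialogue below is invented purely to illustrate the format.
segments = [
    (1.0, 3.5, "Bonjour, et bienvenue."),
    (4.0, 6.2, "À bientôt !"),
]

def srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as the HH:MM:SS,mmm timestamp used by SubRip."""
    total_ms = round(seconds * 1000)
    hours, rem = divmod(total_ms, 3_600_000)
    minutes, rem = divmod(rem, 60_000)
    secs, ms = divmod(rem, 1000)
    return f"{hours:02}:{minutes:02}:{secs:02},{ms:03}"

# Cues are numbered from 1, shown consecutively and separated by blank lines.
# UTF-8 encoding keeps accented letters such as "À" intact.
with open("example.srt", "w", encoding="utf-8") as f:
    for index, (start, end, text) in enumerate(segments, start=1):
        f.write(f"{index}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n\n")
```

Opening the resulting file alongside the video in most players will display each cue between its start and end time.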

How to optimise subtitles

For any subtitles, whether in the same language as the audio or a different one, it is essential that they meet industry-standard technical specifications.

For example, based on the recommended reading speed of 160-180 words per minute, each subtitle should stay on screen for a minimum of roughly a third of a second per word, and this should be taken into account when setting its timestamps. I’m sure we can all agree that there’s nothing worse than having to pause a video to read a subtitle because it disappears from the screen too quickly. In addition, UTF-8 encoding should be used to ensure that accented and non-Latin characters are displayed correctly (e.g. “À bientôt” in French). It is important that a subtitle provider has a thorough process for ensuring compliance with such specifications to optimise the viewing experience.
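
As a rough illustration of that reading-speed arithmetic, the short Python sketch below checks whether a cue stays on screen long enough at 180 words per minute (about 0.33 seconds per word). The threshold and the example line are assumptions for illustration, not a formal specification.

```python
MAX_WORDS_PER_MINUTE = 180  # upper end of the recommended reading speed

def is_readable(text: str, start: float, end: float) -> bool:
    """Return True if the cue is on screen long enough to be read comfortably."""
    words = len(text.split())
    min_duration = words * 60 / MAX_WORDS_PER_MINUTE  # roughly 0.33 s per word
    return (end - start) >= min_duration

# A nine-word subtitle needs about 3 seconds; showing it for only 2 fails the check.
print(is_readable("There is nothing worse than subtitles that vanish early", 10.0, 12.0))  # False
```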

When it comes to localised subtitles, it is crucial that the transcriber is a language expert who can effectively carry the ideas of the source language into the translation. They should have the qualifications and experience required to produce high-quality translated subtitles, so that the viewer reading them gains the same depth of understanding as the viewer listening to the audio.

Netflix has recently been put in the spotlight over its foreign-language subtitles. The streaming service has hosted content in a range of languages since its inception, but the global success of Squid Game last year drew attention to the importance of good subtitles. Among the issues highlighted in the series was the Korean word “oppa”, a term of endearment for an older male, being translated as “babe” in the English subtitles. Although both terms can be considered endearing, this was clearly an inappropriate translation that would most likely confuse any viewer relying on the subtitles.

Conclusion

When localising video content, subtitles are a favourable option, especially when targeting users on the go or younger audiences, and when budgets are limited. Subtitle providers should be chosen carefully to ensure that the delivered product meets a high technical and linguistic standard.

ICS-translate has a wealth of expertise in localisation and subtitling. As an ISO 17100:2015 certified language service provider, we meet the requirements for the core processes, resources, and other aspects necessary for the delivery of a quality translation service. We have an in-depth quality assurance procedure to guarantee that the finished product meets the technical specifications. 

Furthermore, our qualified linguistic experts take into account cultural nuances when translating to make sure that the subtitles are a true representation of the original spoken language. To find out more about our subtitling service, please get in touch.
