While for some viewers the question of watching a show or movie with or without closed captions is simply a matter of preference, for others, closed captions and professional video subtitle translations are what make watching possible at all. And 2020 was a big year for subtitles. In January 2020, the internet was abuzz over director Bong Joon Ho’s Golden Globes acceptance speech, in which he said – as translated by his interpreter, Sharon Choi – “once you overcome the one-inch-tall barrier of subtitles, you will be introduced to so many more amazing films.” And once people began staying home more often to help prevent the spread of COVID-19, streaming content was consumed at a higher rate than ever.
Subtitles ≠ Closed Captions
Before we go any further, it is important to acknowledge the difference between closed captions and subtitles. Closed captions are essentially a written transcript of the audio elements in the program’s original language. They are meant to provide full and equal access to audio content for individuals who are deaf or hard of hearing. Subtitles, on the other hand, are translations of dialogue, text displayed on-screen, and other linguistic elements that allow viewers to understand a video in a foreign language.
Closed captions and subtitles also operate under different constraints. According to the US Federal Communications Commission (FCC), closed captions must “match the spoken words in the dialogue and convey background noises and other sounds to the fullest extent possible.” The same is not necessarily true of subtitles, which typically do not include textual renderings of audio effects like background noises or music.
Subtitles Best Practices
Factors like average reading speed and the risk of cognitive overload also shape how professional video subtitle translations are done. Maximizing readability helps viewers fully appreciate and comprehend the program. And sometimes maximizing readability means not translating the source content word-for-word.
This means that subtitling is a unique form of translation. Paulina Grande, the COO of Mexico-based translation studio Producciones Grande, describes it as “as much about adapting the work as it is translating it,” and argues that “our adapters need to know about scriptwriting as much as they do languages, because otherwise, you’re only doing half the work.” Arguably, when considering best practices for professional video subtitle translations, linguistic knowledge is only one-third of the equation. The other two-thirds are artistry, as Grande argues, and logistics. Determining how many words can fit on the screen, where, in what configuration, and for how long, all while communicating the meaning of the original language, is truly an exercise in strategy. The following are some best practices in subtitling:
- Location
  - Subtitles should appear at the bottom of the screen, except in certain circumstances. One of these is when on-screen text appears at the bottom of the screen, in which case the translation should appear at the top of the screen if possible.
  - Subtitles are typically centered.
- Line breaks and format
  - Each subtitle should ideally contain one complete sentence.
  - Longer sentences should be broken over two lines (and, if necessary, into multiple subtitles) at the highest syntactic node. This means it is preferable to break a subtitle after a noun or verb rather than after a preposition or article.
  - Both lines of a two-line subtitle should be as equal in length as possible. If word length and segmentation at the highest syntactic node do not allow for this, a longer top line is preferable to a longer bottom line.
- Timing
  - A general rule of thumb is to keep a subtitle on screen just long enough to read it, plus 0.25 – 0.5 seconds for cognitive processing. Given average reading speeds, this works out to one second per 2.5 – 3 words (see the sketch after this list).
  - Visual cuts should also be considered. A subtitle should disappear before a cut that indicates a thematic change.
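For readers who like to see the arithmetic spelled out, here is a minimal sketch in Python of how the timing and line-balancing rules above might be approximated. The reading speed, processing buffer, and per-line character limit used here are illustrative assumptions rather than values from any particular style guide, and real subtitling tools also account for syntactic boundaries, shot changes, and frame rates.

```python
# A rough, illustrative pass at the timing and line-balancing guidelines above.
# All constants are assumptions for this example, not prescribed standards.

READING_SPEED_WPS = 2.75   # assumed reading speed: roughly 2.5 - 3 words per second
PROCESSING_BUFFER = 0.4    # assumed cognitive-processing allowance: 0.25 - 0.5 s
MAX_LINE_CHARS = 42        # assumed per-line character limit

def display_duration(subtitle: str) -> float:
    """Estimate how long a subtitle should stay on screen, in seconds."""
    word_count = len(subtitle.split())
    return word_count / READING_SPEED_WPS + PROCESSING_BUFFER

def split_two_lines(subtitle: str) -> list[str]:
    """Break a long subtitle into two lines of roughly equal length,
    preferring a longer top line when a perfect balance is impossible.
    (A real subtitler would also respect syntactic boundaries.)"""
    if len(subtitle) <= MAX_LINE_CHARS:
        return [subtitle]
    words = subtitle.split()
    best_score, best_split = None, [subtitle]
    for i in range(1, len(words)):
        top, bottom = " ".join(words[:i]), " ".join(words[i:])
        # Penalize imbalance, and slightly penalize a longer bottom line.
        score = abs(len(top) - len(bottom)) + (1 if len(bottom) > len(top) else 0)
        if best_score is None or score < best_score:
            best_score, best_split = score, [top, bottom]
    return best_split

example = "She said she would meet us at the old train station after the ceremony ended."
print(round(display_duration(example), 2))  # 15 words -> about 5.85 seconds
print(split_two_lines(example))             # two lines of roughly equal length
```

Many subtitling tools express the same limits in characters per second rather than words per second, but the principle is the same: the subtitle has to be readable in the time it is on screen.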
A professional trained in audiovisual (AV) translation will be aware of these parameters and produce subtitles that are not only linguistically meaningful in the target language but also leave viewers free to engage with the other elements of the content (i.e., they do not spend the entire time reading words on the screen).
Troubles with Subs
While Bong Joon Ho’s assessment that English-speaking moviegoers prefer to watch movies in English may be accurate, the demand, both domestic and global, for video content in languages other than English is also evident. In July 2020, Telemundo, a North American Spanish-language TV network, became the first U.S. broadcast network to surpass 10 million subscribers on its main YouTube channel, and the company has more than 35 million subscribers in total across its 11 YouTube channels. The Observer sums up the business case for subtitles quite well: “treating translation as an afterthought […] comes at the expense of limiting or alienating a large segment of […] potential customers.” Low-quality subtitles that contain linguistic errors or ignore time and visual constraints can mean that the message of your production is lost on a non-English-speaking audience, or that the viewing experience is so poor that viewers give up entirely.
AV content giants Netflix and YouTube have spotty histories where subtitles are concerned. Netflix launched a community subtitling project through video captioning provider Amara in 2012, but shifted gears to roll out its own program, Hermes, in 2017. Hermes was meant to be a database of qualified subtitlers covering a range of content types and language pairs, but it was discontinued only a year after its launch. Allison Smith, then the Program Manager of Localization Solutions at Netflix, spoke on the issue at the Languages & The Media 2018 conference in Berlin, explaining that the Hermes approach was more than Netflix was equipped to handle and that the company had decided to move forward exclusively with its established localization vendor partnerships. Smith also admitted that while Hermes was Netflix’s attempt to “own the full process” of localization, the company ultimately felt that localization vendors were “better suited to use their core competencies and add value to the content localization ecosystem.” In other words, Netflix realized that it made more sense to rely on localization vendors’ expertise than to build it from the ground up.
In 2015, YouTube launched a toolkit enabling community-contributed subtitles and translations of video titles and descriptions. Creators could either appeal to their viewers to contribute subtitles or connect with willing translators via a YouTube-hosted marketplace. This capability outlasted Hermes by several years, but in October 2020 YouTube retired the Community Contributions feature, citing low usage and reports of spam and abuse.
A Case Study: Netflix
Less than a year after Languages & The Media 2018, Netflix’s subtitles were still coming under fire. In May 2019, Jean-Luc Wachthausen wrote an article for Le Point, which Frenchly.us later translated as “Netflix has Terrible Subtitles.” This is an arguably harsher rendering of the original French title, which translated directly would be “Why Netflix’s subtitles border on amateurism.” Neither version is a vote of confidence. The article begins by addressing a statement released a few months earlier by the French organization Association des traducteurs-adaptateurs de l’audiovisuel (Association of Audiovisual Translator-Adaptors, ATAA) regarding the French subtitles of the Academy Award-winning film Roma. The statement, written by the organization’s vice president, Sylvestre Meininger, points out both linguistic and technical errors and goes so far as to speculate that Netflix had enlisted unskilled, low-cost contractors, despite the company’s claims that it works with professional language service providers.
Wachthausen is perhaps a bit more generous than Meininger, as he acknowledges that Netflix has made “efforts to establish satisfactory quality standards for dubbing and subtitling in a highly competitive market.” But he also blames the immense amount of content on the platform, and the pressure to deliver finished products quickly, for “sometimes sloppy” subtitles. And despite assurances from French post-production company VDM, which works with Netflix, that the streaming company demands nothing but the highest quality, translators and lay viewers alike still complain about the quality of the platform’s subtitles. The hashtag #TraduisCommeNetflix (#TranslatedLikeNetflix) and the Tumblr thread Les sous-titres de la honte (Subtitles of Shame) are two online manifestations of francophones’ dissatisfaction with Netflix’s French subtitles. Wachthausen writes that Netflix monitors these complaints and forwards them to its translation providers. And while taking these complaints into account is better than ignoring them, one could argue that they should not have been necessary in the first place. The types of inaccuracies cited seem to suggest that, in the words of the Observer, Netflix is “treating translation as an afterthought.” Meininger would agree, as he writes that “the ‘subtitling’ of Roma sheds […] light on the importance that this distributor gives to the content it airs and the audiences likely to be watching.” He also notes that while he has seen some corrections made to erroneous subtitles, “the new version is seldom better than the first.” Ouch.
You may also have seen one of the many articles circulating about the English subtitles for Squid Game. The show was an instant hit worldwide, but along with the critical acclaim came a barrage of criticism from Korean-speaking fans about the quality of the English subtitles. While one viewer described them as “botched,” the criticism of Squid Game’s English subtitles seems to be much more nuanced than that of the French subtitles for Roma. For example, the first result when you search “Squid Game subtitles” on English-language Google reminds viewers that the English closed captions will differ from the English subtitles. Why? Because the closed captions are based on the English-language dubbed version of the show. And dubbing is truly a horse of a different color in the AV translation world! The translations for a foreign-language dub have to be adapted to match the movements of the actors’ mouths more closely. Otherwise, it can be a very off-putting viewing experience to listen to an audio track that seems disconnected from what the actors’ mouths are doing on screen.
Several articles also argue that part of professional video subtitle translation is adapting the source content to the target audience. And while Korean pop culture certainly has a significant fan base in the United States, it wouldn’t be unheard of for a translator to exercise artistic license if they think the target audience won’t understand a certain reference, or if a turn of phrase doesn’t have the same punch. A silly example is this joke in English: Q: Why was 6 afraid of 7? A: Because 7, 8, 9. In any language in which “ate” and “8” aren’t pronounced the same way, a direct translation of this joke wouldn’t make any sense, let alone be clever or funny. A good translator would instead substitute a comparable joke in the target language: one that a child might tell and that relies on wordplay. (If you want to learn more about creative complications like this, check out our post on common challenges in marketing translations!)
Strategies for Subtitling
Another session at Languages & The Media 2018 allowed translators in attendance to share feedback on capabilities that Netflix’s subtitling software should include. The most popular feature requests were spellcheck and autocorrect capabilities, an offline version of the cloud platform, and translation memory (TM) integration. Language Solutions’ processes involve all of these features! Our translators and editors work in computer-assisted translation (CAT) software with spellcheck and quality assurance functions, we share file-based resources so that our teams can work on projects online or offline, and TM maintenance is part of the workflow for every project we complete. Most CAT tools have only basic autocorrect capabilities, but features like termbases and AutoSuggest can reduce the amount of manual typing compared to working in other platforms.
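To give a concrete sense of what translation memory integration means in practice, here is a small, hypothetical sketch of a fuzzy TM lookup. The stored segments, the match threshold, and the use of Python’s standard difflib matcher are all simplifications for illustration; commercial CAT tools use far more sophisticated segmentation and matching.

```python
from difflib import SequenceMatcher

# Toy translation memory: previously translated English -> French segment pairs.
# The segments and the 75% threshold below are assumptions for this example.
TRANSLATION_MEMORY = {
    "Subtitles should appear at the bottom of the screen.":
        "Les sous-titres doivent apparaître en bas de l'écran.",
    "Each subtitle should contain one complete sentence.":
        "Chaque sous-titre doit contenir une phrase complète.",
}

def tm_lookup(segment: str, threshold: float = 0.75):
    """Return the stored translation whose source text best matches `segment`,
    along with the similarity score, or (None, score) if no match clears the bar."""
    best_score, best_target = 0.0, None
    for source, target in TRANSLATION_MEMORY.items():
        score = SequenceMatcher(None, segment.lower(), source.lower()).ratio()
        if score > best_score:
            best_score, best_target = score, target
    if best_score >= threshold:
        return best_target, best_score
    return None, best_score  # no usable match; the linguist translates from scratch

suggestion, score = tm_lookup("Each subtitle should ideally contain one complete sentence.")
print(f"{score:.0%} match: {suggestion}")
```

The point is not the code itself but the workflow it represents: every approved translation feeds back into the memory, so recurring phrasing across episodes or videos only has to be translated (and quality-checked) once.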
One commonality among many of the articles cited in this post is that there is still a lot of mystery around streaming platforms’ approach to subtitles. The inconsistent availability and quality of subtitles across platforms like Netflix, Hulu, and HBO Max suggests that they have yet to find a truly successful strategy. The examples of Netflix’s Hermes and YouTube’s Community Contributions feature do seem to indicate that companies are coming around to the idea that the heavy lifting of localization should be left to industry professionals. Language Solutions has worked with international clients like Vocera and Visa on professional video subtitle translations, and we’ve also written about the ins and outs of professional video translation services. If you’d like to bring your video content to an international audience, let us bring our proven processes and network of trained AV translation professionals to your subtitling projects. Have questions or want to get the conversation started? We’d love to hear from you!