Monday, July 3, 2017

A Closer Look at SDL's new MT announcements

SDL recently announced some new initiatives around their machine translation product, SDL Enterprise Translation Server (ETS). As with most press releases, there is very little specific detail in the release itself, and my initial impression was that there was not much news here beyond the mention that they are also doing Neural MT. So I approached my SDL contacts and asked if they would be willing to share more information for this blog, for readers curious about what the announcement really means. I initially set out with questions focused on NMT, but the more I learned about SDL ETS, the more I felt it deserved broader attention. The SDL team were forthcoming and shared interesting material with me, which allows me to provide a clearer picture of the substance behind this release (something I think Enterprise Buyers, in particular, should take note of), summarized below.

The press release basically focuses on two things:
  1. An update to SDL ETS 7.4, “a secure Machine Translation (MT) platform for regulated industries” (though I am not sure why it would not also apply to any large global enterprise, especially those, such as eCommerce companies, that depend on a more active dialogue with customers), and
  2. The availability of Neural Machine Translation (NMT) technology on this secure and private MT platform.

Why is the SDL ETS announcement meaningful?

The SDL ETS platform is an evolution of the on-premises MT product offering that has been in use in government environments for over 15 years now, and has been widely used in national security and intelligence agencies across the NATO bloc of countries. Given the nature of national security work, the product has had to be rock solid and as support-free as possible, since anti-terrorist analysts are not inclined, or even permitted, to make calls for technical support to the MT vendor. Those of you who have struggled through clunky, almost-working on-site MT software from other MT vendors less prepared for this low-support use case will probably appreciate the value of SDL's long-term experience in serving this market need.

As we have seen of late, determined hackers break into both government and corporate networks on a regular basis. This graphic and interactive visualization is quite telling about how frequently hackers succeed, and how frequently large sets of allegedly private data are accidentally exposed. So it is understandable that when the new SDL management surveyed the priorities of large global enterprises, they found that "Data Security and Data Privacy" were a key concern for many executives across several industries.

In a world where MT is a global resource, and 500 billion words a day are being translated across the various public MT portals, data security and privacy have to be a concern for any responsible executive and for any serious corporate governance initiative. While the public MT platforms have indeed made MT ubiquitous, they also generally reserve the right to run machine learning algorithms on our data, to pick up useful patterns from our usage and continue to improve their services. Today, MT is increasingly used to translate large volumes of customer and corporate communications, and most responsible executives would rather not share the intimate details of their customer and intra-corporate global communications with public MT services where privacy could be compromised.

If you think that your use of MT is not monitored or watched, at least at a machine learning level, you should perhaps take a look at the following graphic, which summarizes what they collect.

[Infographic: a summary of the data Google collects about its users (an updated version is linked in the comment below)]

The notion that these MT services are “free” is naive, and we cannot really be surprised that the public MT services try to capitalize on what they can learn from the widespread use of their MT services. The understanding gained from watching user behavior not only helps improve the MT technology, it also provides a basis to boost advertising revenues, since an MT service provider has detailed metrics on what people translate, and search on, in different parts of the world. 

To adapt the original ETS platform to the different needs of the global enterprise market, SDL had to add several features and capabilities that were not required for national security information triage applications, where MT was invariably an embedded component service, interacting with other embedded components like classification and text analytics in a larger information analysis scenario. The key enhancements added are for the broader enterprise market, where MT can be an added corporate IT service for many different kinds of applications, and the MT service needs direct as well as embedded access. The new features include the following:
      • A redesigned and intuitive interface that improves the user experience for product installation, administration, and ongoing operation and management as needs change. Because the GUI is web-based, no installation is required on individual user machines, and users and admins can easily get up to speed with SDL ETS via its Web GUI.
        • The new browser-based user interface includes features like Quick Translate, Browse Translate, Host Management, and User Management
      • A scalable architecture that accommodates both low and high translation throughput demands, with a load balancer that automatically distributes client requests across available MT resources for efficient throughput and synchronization of translation services.
      • Time to deployment is minimized with various kinds of installation automation. SDL ETS can be deployed swiftly without the need to manually install any extra third-party software components. SDL ETS services are installed as OS services on both Windows and Linux, so they restart automatically when the system restarts. (This is in contrast to most Moses-based solutions on the market.)
      • User roles & authentication
        • Enables user access via permission-based login and/or authentication against the corporate central Active Directory via LDAP.
      • Scaling and managing SDL ETS deployments are made easy with centralized Host Management.  Admins no longer need to access individual ETS servers and modify configuration files.  Setup can be done via the SDL ETS Web GUI’s Host Management module and includes things like loading custom dictionaries for specific applications.
      • Includes state-of-the-art Neural Machine Translation technology for the highest quality machine translation output
      • Highly tuned MT engines that reflect the many years of MT developer engagement with SDL human translation services, and ongoing expert linguistic feedback that is a driving force behind higher quality base translations
      • Ease of access through an MS-Office plug-in and a rich REST API for integration with other applications and workflows (a hypothetical call sketch follows this list)
      • Enhanced language detection capability
        • Supports automatic detection of over 80 languages and 150 writing scripts.
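
To make the REST API item above a little more concrete, here is a hypothetical sketch of what a client-side integration might look like. The host, endpoint path, JSON field names, and authentication scheme are illustrative placeholders of my own invention, not taken from SDL ETS documentation.

```python
# Hypothetical sketch of calling a REST-style on-premise MT service.
# The host, path, and JSON field names below are illustrative placeholders,
# NOT the documented SDL ETS API.
import requests

ETS_HOST = "https://ets.example.corp:8080"   # placeholder on-premise host
API_KEY = "changeme"                          # placeholder credential

def translate(text, source_lang="eng", target_lang="fra"):
    """Send one segment to a hypothetical /translate endpoint and return the MT output."""
    response = requests.post(
        f"{ETS_HOST}/translate",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"sourceLanguage": source_lang,
              "targetLanguage": target_lang,
              "text": text},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("translation", "")

if __name__ == "__main__":
    print(translate("All customer data stays inside the corporate network."))
```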
 


How does SDL ETS differ from other MT on-premise solutions?

Based on my experience with, and observation of, other on-premise MT systems, I think it is fair to say that the SDL ETS features are a significant step forward for the translation industry in bringing the capabilities of industrial-strength MT up to modern enterprise IT standards and needs. In #americanbuffoon-speak we might even say it is tremendous and bigly good.


Based on what I was able to gather from my conversations, here is a list of distinctive features that come to mind. Note that most of these updates relate to the improved UX, the elegant simplicity of SDL ETS, and the ease of ongoing management of changing needs and requirements from an enterprise customer's viewpoint.
  • More scalable and elastic, and easier for the customer to manage without calling in technical experts from the MT vendor
  • Ease of administration and ongoing management and maintenance of different corporate applications that interact with MT translation services
  • Powered by SDL’s proprietary PB-SMT & NMT technologies
  • Efficiency of architecture – fewer servers needed for the same amount of work

 


A closer look at SDL’s Neural MT


Except for the NMT efforts at Facebook, Google, and Microsoft, most of the NMT initiatives we hear about in the translation industry today are based on open source solutions built around the Torch and Theano frameworks. While using open source allows an MT practitioner to get started quickly, it also means submitting to the black-box nature of the framework. Very few are going to be able to go into the source code to fundamentally alter the logic and mechanics of the framework without potentially damaging or destabilizing the basic system. The NMT systems that practitioners are able to develop are only as good as the data they use, or their ability to modify the open source codebase.

In contrast to this, the SDL NMT core engine is owned and developed 100% in-house by SDL, which allows much greater flexibility and deep underlying control of the basic logic and data processing flows of the NMT framework. This deeper control also gives a developer more alternatives in dealing with NMT challenges like limited vocabulary and performance/speed, and in changing machine learning strategies and techniques as the market and the deep learning technology evolve, e.g. switching from recurrent neural network (RNN) to convolutional neural network (CNN) deep learning strategies, as Facebook just did.

My sense, based on my limited understanding, is that owning your NMT code base very likely affords more powerful control options than open source alternatives allow, because problem areas can be approached at a more fundamental level, in well-understood source code, rather than through workarounds that handle problematic output from open source black-box components. It is also possible that owning and understanding the code base results in longer-term integrity and stability. This is probably also why the big three choose to develop their own code bases rather than build on open source components and foundations. The SDL system architecture reflects 15+ years of experience with data-driven MT and is designed to allow rapid response to emerging changes in machine learning technology, like a possible change to CNN from the current RNN + Attention approach that everybody is using.
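
To make the "RNN + Attention" shorthand a bit more concrete, here is a minimal, generic sketch of the attention step itself: the decoder scores each encoder state and takes a weighted average as its context vector. This is my own illustration of the common technique, not SDL's (or anyone else's) production code.

```python
# Minimal NumPy sketch of dot-product attention, the core of the
# "RNN + Attention" decoders discussed above. Generic illustration only.
import numpy as np

def attention_context(decoder_state, encoder_states):
    """Return an attention-weighted context vector over the encoder states.

    decoder_state:  shape (hidden,)          current decoder hidden state
    encoder_states: shape (src_len, hidden)  one hidden state per source token
    """
    scores = encoder_states @ decoder_state          # relevance of each source position
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                         # softmax -> attention distribution
    context = weights @ encoder_states               # what the decoder "looks at" now
    return context, weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    enc = rng.normal(size=(6, 8))    # 6 source tokens, hidden size 8
    dec = rng.normal(size=(8,))
    context, w = attention_context(dec, enc)
    print("attention weights:", np.round(w, 3))
```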

In my conversations with SDL technical team members, it became apparent that they have a much greater ability to address several different NMT problem areas:

  • Vocabulary – SDL has multiple strategies to handle this issue in many different use scenarios – both when the source data universe is known, and when it is unknown and developers wish to minimize unknown word occurrences (a generic subword sketch follows this list).
  • Neural Babble – NMT systems often produce strange output that the SDL developers call neural babble. One such scenario is when the output repeats the same phrase, mysteriously, multiple times. SDL has added heuristics and corrective strategies to reduce and eliminate this and other errant behavior. This is an area that open source NMT systems cannot easily resolve on their own and that requires added pre- and post-processing to manage (a simple detection heuristic is sketched after this list).
  • Speed/Performance issues can be better managed because the code base is owned and understood, so it is even possible to make changes to the decoder if needed. SDL is testing and optimizing NMT performance on a range of GPUs (cheap, mid-range, and premium) to ensure that their client base has well-understood and well-tested deployment options.
  • Rapid Productization of Deep Learning Innovation: Owning the code base also means that SDL could readily switch from the current deep learning approach (RNN) to newer approaches like CNNs, which may prove to be much more promising and efficient for many applications that need better production performance. This agility and adaptability can only come from a deep understanding and control of how the fundamentals of the NMT system work.
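
On the vocabulary point above, one widely used family of techniques (and only a guess at the general direction, since SDL's actual strategies are not public) is to segment rare or unseen words into known subword units, BPE/WordPiece-style, so the engine rarely meets a truly unknown token. A toy sketch:

```python
# Toy sketch of subword segmentation as a way to limit NMT vocabulary size.
# Generic illustration of the idea (BPE/WordPiece-style), not SDL's implementation.
SUBWORDS = {"trans", "lat", "ion", "de", "cod", "er", "un", "known", "s"}

def segment(word, subwords=SUBWORDS):
    """Greedy longest-match segmentation of a word into known subword units."""
    pieces, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):   # try the longest remaining piece first
            if word[i:j] in subwords:
                pieces.append(word[i:j])
                i = j
                break
        else:                               # no known piece covers this position
            return ["<unk>"]
    return pieces

for w in ["translation", "decoders", "unknowns", "qwerty"]:
    print(w, "->", segment(w))
```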

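And on the neural babble point, the simplest corrective checks are post-processing heuristics that flag pathologically repeated phrases in the output. The sketch below illustrates the kind of detection one might bolt on; it is not SDL's corrective strategy.

```python
# Simple post-processing heuristic that flags repeated phrases ("neural babble")
# in MT output. Generic illustration, not SDL's implementation.
from collections import Counter

def repeated_ngrams(text, n=3, threshold=2):
    """Return the n-grams that occur more than `threshold` times in a segment."""
    tokens = text.lower().split()
    ngrams = [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return [g for g, c in Counter(ngrams).items() if c > threshold]

output = ("the committee approved the budget the committee approved "
          "the budget the committee approved the budget")
babble = repeated_ngrams(output)
if babble:
    print("possible neural babble, repeated phrases:", babble)
```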

The NMT customization and adaptation options are currently being explored and benchmarked against well-understood PB-SMT systems. Initial results are providing great insight into the specific data combination and pruning strategies that produce the best custom NMT system output. SDL's long-term experience building thousands of custom systems should be invaluable in driving the development of superior custom NMT solutions. The research methodology follows best practices (i.e. they are careful and conservative, unlike the over-the-top claims made by Google), and we should expect the production NMT systems to be significantly superior to most other alternatives. While I am not at liberty to share details of the benchmark comparisons, I can say that the improvements are significant and especially promising in language combinations that matter most to global enterprises. The SDL team is also careful in making public claims about improved productivity and quality (unlike some MT vendors in the market), and is gathering multiple points of verification from both internal and customer tests to validate initial results, which are very promising.
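
For readers unfamiliar with how such benchmarking is typically run, the sketch below scores NMT and PB-SMT output against the same reference translations with corpus BLEU. The tiny data set and the choice of the sacrebleu package are my own assumptions for illustration; this says nothing about SDL's internal methodology or results.

```python
# Generic sketch of benchmarking two MT systems against the same references
# with corpus BLEU. Illustration only; not SDL's methodology or data.
import sacrebleu

references = ["the contract must be signed by both parties",
              "delivery is expected within ten working days"]
nmt_output = ["the contract must be signed by both parties",
              "delivery is expected within ten business days"]
smt_output = ["the contract must be signed of both parties",
              "the delivery is expected inside ten working days"]

nmt_bleu = sacrebleu.corpus_bleu(nmt_output, [references])
smt_bleu = sacrebleu.corpus_bleu(smt_output, [references])
print(f"NMT BLEU: {nmt_bleu.score:.1f}   PB-SMT BLEU: {smt_bleu.score:.1f}")
```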


I expect that they will also start (or possibly have already started) exploring linking their very competitive Adaptive MT capabilities with high-quality NMT engines. I look forward to learning more about their NMT experience in production customer environments.

Happy 4th of July to my US friends, and here is an unusual presentation of the national anthem that is possibly more fitting for the current leadership.

1 comment:

  1. An updated summary of what Google knows about you - probably much more than your mother knows https://visual.ly/community/infographic/technology/how-much-does-google-really-know-about-you
