Showing posts with label TM Repository.

Apr 19, 2012

Kilgray survey until April 25th: ready, aim, respond!

Kilgray is conducting a survey until April 25th, the results of which will be discussed at the conference in May. This is an opportunity for users of memoQ and the company's other products to say what's working and what's not, and perhaps to influence the future course of development.

It's a short survey and should take little time to answer. The first question asks for three things Kilgray is doing right, the second asks for three things that are wrong and should stop, and further questions ask your opinion on the importance of the company maintaining its independence (in contrast to SDL Trados, Star Transit or Wordfast, for example, which are owned by language service resellers that compete with their agency customers and freelancers for business) and what "big idea" you might like to see implemented.

On several occasions in past years, I have seen Kilgray respond quickly to clearly expressed user feedback, and I believe that this is still a reasonable expectation. So have your say to help make some good tools better!

Apr 14, 2011

Coming to terms with Kilgray

The pre-conference day of memoQfest 2011 offered dual tracks in the morning session: one for the TM Repository and one for qTerm, the advanced, server-based terminology module for memoQ. Ever since attending a webinar when qTerm was released last October, I have been intending to blog about it, and since I already had an overview of what I needed to know about the Repository from last year, I chose the qTerm track. However, the Twitter feed from Polish translator @wasaty made it clear that I was missing a lot of interesting news about Kilgray's TM management technologies.

István Lengyel and Gergely Vandor of Kilgray served up a lot of meaty technical detail on qTerm, much of which could be the subject of an entire blog post. It's a product with great potential, I think, and I expect it will evolve considerably over the next year. Gergely's insights into the problems often encountered in data migration, and into how non-standard the TBX "standard" truly is, were particularly interesting to me, revealing some useful information about data exports in that format from SDL's MultiTerm. And I thought the world of TMX was a big mess....
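
Anyone who has tried to move termbase data between tools will recognize the basic problem: the same information is labeled differently in each export. Just as an illustration of the sort of normalization step a migration typically needs (the category variants and file names below are hypothetical, not taken from any particular tool's output), a few lines of Python go a long way:

    import xml.etree.ElementTree as ET

    # Hypothetical variants of the same data category, as seen across different exports
    CATEGORY_MAP = {
        "POS": "partOfSpeech",
        "Part of speech": "partOfSpeech",
        "partOfSpeech": "partOfSpeech",
    }

    def normalize_term_notes(tbx_path, out_path):
        # termNote with a "type" attribute is standard TBX; namespaces are ignored here for brevity
        tree = ET.parse(tbx_path)
        for note in tree.iter("termNote"):
            original = note.get("type", "")
            note.set("type", CATEGORY_MAP.get(original, original))
        tree.write(out_path, encoding="utf-8", xml_declaration=True)

    normalize_term_notes("export_from_tool_a.tbx", "normalized.tbx")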

István's telling of the history of term management at Kilgray offered me a look at the very different world of translators of technical information working with small-distribution languages. In some respects that world is far removed from my own, and I very much appreciate how being shaken out of my comfortable German/English perspective can sometimes flush my hardened arteries and get a little more oxygen to my brain. And the repeated references to the role of terminology in branding by all three speakers in that morning session gave me some new ideas for helping my direct clients and agencies understand even more clearly the importance of getting terminology right.

But like yesterday's train-the-trainer session, the real highlight of the day for me was not a technical presentation with specific details of a product I find interesting. It was a general discussion of purpose and philosophy in terminology management, by expert terminologist and consultant Barbara Karsch, who was deeply involved with terminology at JD Edwards and Microsoft before becoming an independent service provider to LSPs and corporate translation consumers. Her web site offers a lot of interesting information and definitions that are well worth reading. Her methodical presentation of the real costs involved in terminology mis- or non-management left little room for excuses and made a strong, objective case that any sober business person can appreciate. What I learned from her will help me make a better case to clients I value so they can help themselves. I very much look forward to getting a copy of the slides from that presentation.

Among all her valuable advice, however, one particular point stands out for me, an obvious one that I know well from my own experience. It applies both to terminologies and to the collections of data so often mined inefficiently for terminology: translation memories. Without maintenance and updating, terminologies (and TMs) eventually become worthless. There is a definite life cycle to much of this data. All the babbling about matches, fuzzy matches and the like, and the inertial complacency many of us and our clients show toward existing collections of data, can too easily cause us to lose perspective and sacrifice future quality and reputation while indulging in the illusion of cost savings.
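
To make that point a little more concrete: TMX records creation and change dates on every translation unit, so even a trivial script can show how much of a supposedly valuable memory has not been touched in years. Here is a minimal sketch; the five-year cutoff and the file name are arbitrary illustrations, nothing more:

    import xml.etree.ElementTree as ET
    from datetime import datetime, timedelta

    # Arbitrary threshold: anything untouched for five years counts as "stale"
    CUTOFF = datetime.utcnow() - timedelta(days=5 * 365)

    def count_stale_units(tmx_path):
        stale = 0
        for tu in ET.parse(tmx_path).iter("tu"):
            # TMX stores timestamps like 20100525T143000Z on each translation unit
            stamp = tu.get("changedate") or tu.get("creationdate")
            if stamp and datetime.strptime(stamp, "%Y%m%dT%H%M%SZ") < CUTOFF:
                stale += 1
        return stale

    print(count_stale_units("big_legacy_memory.tmx"), "translation units untouched for five years or more")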

Those who make an effort to think clearly about the real costs of processes and decisions very often discover truths at odds with lazy common wisdom and benefit accordingly. When we get beyond the fear, uncertainty and doubt invoked by dishonest companies and "experts" with an agenda at odds with the interests of freelancers, public bodies and LSPs of acceptable size, we are very likely to arrive at decisions that seem risky to those blinded by propaganda.

Viewed objectively, the case is very clear for efficient, modern management of terminology with technologies such as those offered by Kilgray. And considering the larger technological context of the integrated environment in which tools such as qTerm work, the case for memoQ as a mother lode of value for translation management is just as clear. That is perhaps why I will be able to greet esteemed colleagues from SDL and other important contributors to the translation tools industry who will be attending memoQfest this year to divine the future directions of translation technology :-)

May 25, 2010

The TM Repository


One of the most interesting presentations for me at this year's memoQ Fest in Budapest was the one given by Kilgray's CEO Balázs Kis on the company's new product in beta testing, the TM Repository. The concept was actually discussed at last year's conference, but it has matured a lot since then and developed into a web-based application (built on Microsoft IIS, ASP and SQL Server) for sophisticated management of translation memory resources from any source. Here are two more slides that emphasize aspects which I find particularly useful:




The complete presentation slide set can be downloaded here, and I think the talk will eventually be made available on YouTube.

It's important to understand that this is not a memoQ add-on. It is a tool intended to manage TMX data from any translation environment without metadata loss, with support for mapping attributes to other systems, data maintenance and much more. It could be used, for example, by LSPs and corporations with extremely large Trados data sets, and it offers version control for the data. I'm not aware of another translation environment tool that does this in an efficient way.
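
I have no inside knowledge of how the Repository handles this internally, but the underlying problem is easy to see in any large TMX export: each environment stashes its own metadata in prop elements, and a naive conversion silently drops it. A rough sketch of the kind of inventory worth taking before any migration (the file name is just a placeholder):

    import xml.etree.ElementTree as ET
    from collections import Counter

    def metadata_inventory(tmx_path):
        # Count every prop type used in the file, including untyped properties
        counts = Counter()
        for prop in ET.parse(tmx_path).iter("prop"):
            counts[prop.get("type", "(untyped)")] += 1
        return counts

    for prop_type, n in metadata_inventory("legacy_export.tmx").most_common():
        print(prop_type, n)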

Organizations interested in learning more about this technology and how it might support their workflows should contact Kilgray.