Machine Translation in Your Own Language
Machine translation renders a text so that readers can understand it in their own language. At its simplest it performs a mechanical substitution of words in one language for words in another, but that alone rarely produces a good translation, because recognition of whole phrases and their closest counterparts in the target language is needed. Corpus-based statistical techniques, a rapidly growing field, address this problem and are leading to better translations. Current machine translation software often allows for customization by domain or profession (for example, weather reports), improving output by limiting the scope of allowable substitutions.
This technique is particularly effective in domains where formal or formulaic language is used: machine translation of government and legal documents more readily produces usable output than translation of conversation or less standardized text. Human intervention also improves output quality; for example, some systems translate more accurately if the user has unambiguously identified which words in the text are names. Machine translation has proved to be a useful tool for assisting human translators and, in a limited number of cases, can even produce output that can be used as is (for example, weather reports).
The progress and potential of machine translation have been debated throughout its history. Since the 1950s, a number of scholars have questioned the possibility of achieving fully automatic machine translation of high quality. Some critics claim that there are in-principle obstacles to automating the translation process.
The idea of machine translation may be traced back to the 17th century, when René Descartes proposed a universal language in which equivalent ideas in different tongues would share one symbol. Yehoshua Bar-Hillel began the first research in the field at MIT in 1951, and a Georgetown research team followed with a demonstration of its system in 1954. Machine translation programs popped up in Japan and Russia in 1955, and the first machine translation conference was held in London in 1956.
As more researchers joined the field, the Association for Machine Translation and Computational Linguistics was formed, and the National Academy of Sciences set up the Automatic Language Processing Advisory Committee (ALPAC) to study machine translation. The ALPAC report of 1966 found that ten years of research had failed to fulfill expectations, and funding was greatly reduced.
In 1970 the French Textile Institute used machine translation to translate abstracts from and into French, English, German and Spanish. In 1971 Brigham Young University started a project to translate Mormon texts by automated translation, and in 1978 Xerox used SYSTRAN to translate technical manuals. In 1984 various machine translation companies were launched, including Trados, which in 1989 became the first to develop and market translation memory technology. The first commercial machine translation system for Russian, English, German and Ukrainian was developed at Kharkov State University in 1991.
The idea of using digital computers for translation of natural languages was proposed as early as 1946 by A. D. Booth, and others proposed similar ideas around the same time. In 1949 Warren Weaver wrote an important memorandum, "Translation". The Georgetown experiment was the first such application, and a demonstration of rudimentary translation of English into French was also made on the APEXC machine at Birkbeck College, University of London. A few papers on the topic were published at the time, along with articles in popular journals.
Behind the procedure of translation lies a complex cognitive operation. To decode the meaning of the source text in its entirety, the translator must interpret and analyze all the features of the text, a process that requires in-depth knowledge of the grammar, semantics, syntax, idioms, etc. of the source language; the translator then needs the same in-depth knowledge to re-encode the meaning in the target language. Therein lies the challenge of machine translation: how to program a computer so that it will understand a text as a person does and will produce a new text in the target language that sounds as if it had been written by a person. This problem may be approached in a number of ways.
The Bernard Vauquois pyramid, shown in the picture, illustrates the comparative depths of intermediary representation: interlingua machine translation at the peak, followed by transfer-based translation, then direct translation.
Machine translation can use a method based on linguistic rules, which means that words are translated in a linguistic way: the most suitable words of the target language replace the corresponding words of the source language. It is often argued that the success of machine translation requires the problem of natural language understanding to be solved first.
In practice, machine translation programs often work well enough for a native speaker of one language to get the approximate meaning of what was written by a native speaker of another. The difficulty is getting enough data of the right kind to support a particular method: for example, the large multilingual corpus needed for statistical methods is not required for grammar-based methods, while grammar-based methods need a skilled linguist to carefully design the grammar they use. A technique referred to as transfer-based machine translation is well suited to translating between closely related languages.
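As a rough illustration of the transfer idea, the toy program below translates a short English phrase into Spanish in three steps: analysis, a single structural transfer rule, and generation. The lexicon, the part-of-speech sets and the reordering rule are invented assumptions for this sketch and do not reflect any particular system.

```python
# Toy shallow-transfer translator (English -> Spanish).
# The lexicon, the part-of-speech sets and the single reordering rule
# are invented assumptions for illustration only.

LEXICON = {"the": "el", "red": "rojo", "car": "coche", "runs": "corre"}
ADJECTIVES = {"red"}   # toy part-of-speech information
NOUNS = {"car"}

def analyze(sentence):
    """Analysis: split the source text into lowercase tokens."""
    return sentence.lower().split()

def transfer(tokens):
    """Structural transfer: English 'adjective noun' becomes Spanish 'noun adjective'."""
    out = list(tokens)
    for i in range(len(out) - 1):
        if out[i] in ADJECTIVES and out[i + 1] in NOUNS:
            out[i], out[i + 1] = out[i + 1], out[i]
    return out

def generate(tokens):
    """Generation: look each token up in the bilingual lexicon."""
    return " ".join(LEXICON.get(t, t) for t in tokens)

print(generate(transfer(analyze("The red car runs"))))
# -> "el coche rojo corre"
```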
The interlingua approach is one instance of rule-based machine translation: the source language, i.e. the text to be translated, is transformed into an interlingua, and the target language is then generated out of that interlingua. Machine translation can also use a method based on a dictionary, which means that words are translated one by one as a dictionary would translate them.
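The following toy sketch illustrates the interlingua idea with entirely invented vocabulary: source words are first mapped to language-neutral concepts, and any target language can then be generated from those concepts with its own independent lexicon.

```python
# Toy interlingua translator: source words are mapped to language-neutral
# concepts, and each target language is generated from the concepts with
# its own lexicon. All word lists are invented for illustration.

EN_TO_CONCEPT = {"dog": "CANINE", "eats": "EAT", "bread": "BREAD"}
CONCEPT_TO_FR = {"CANINE": "chien", "EAT": "mange", "BREAD": "pain"}
CONCEPT_TO_DE = {"CANINE": "Hund", "EAT": "isst", "BREAD": "Brot"}

def to_interlingua(tokens):
    """Analysis: replace each source word with its abstract concept."""
    return [EN_TO_CONCEPT[t] for t in tokens]

def from_interlingua(concepts, target_lexicon):
    """Generation: realise each concept in the chosen target language."""
    return " ".join(target_lexicon[c] for c in concepts)

concepts = to_interlingua(["dog", "eats", "bread"])
print(from_interlingua(concepts, CONCEPT_TO_FR))  # -> "chien mange pain"
print(from_interlingua(concepts, CONCEPT_TO_DE))  # -> "Hund isst Brot"
```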
Statistical machine translation tries to generate translations using statistical methods based on bilingual text corpora, such as the Canadian Hansard corpus, the English-French record of the Canadian parliament. Where such corpora are available, good results can be achieved, but they are still rare for many language pairs. CANDIDE from IBM was the first statistical machine translation system.
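A very crude sketch of the statistical idea is shown below: translation preferences are estimated simply by counting how often a source word co-occurs with each target word across a handful of invented sentence pairs. Real systems such as IBM's CANDIDE use proper alignment and language models; this is only a toy illustration.

```python
# Toy word-level statistical translation: count co-occurrences of source
# and target words in a tiny invented parallel corpus, then pick the most
# frequent pairing for each source word.
from collections import Counter, defaultdict

parallel_corpus = [                     # invented English-French pairs
    ("the house", "la maison"),
    ("the car", "la voiture"),
    ("the blue house", "la maison bleue"),
    ("a house", "une maison"),
]

cooc = defaultdict(Counter)
for en, fr in parallel_corpus:
    for e in en.split():
        for f in fr.split():
            cooc[e][f] += 1             # count every co-occurrence

def translate_word(word):
    """Return the target word most often seen alongside `word`."""
    return cooc[word].most_common(1)[0][0] if word in cooc else word

print([translate_word(w) for w in "the house".split()])
# -> ['la', 'maison']  ('the' co-occurs most with 'la', 'house' with 'maison')
```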
Google used SYSTRAN for several years but switched to a statistical method in October 2007. Google improved its capabilities by using approximately 200 billion words of United Nations material to train its system. Example-based machine translation was proposed by Makoto Nagao in 1984. It is often characterized by its use of a bilingual corpus as its main knowledge base: it is essentially translation by analogy and can be viewed as an implementation of the case-based reasoning approach to machine learning.
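The toy sketch below illustrates translation by analogy with an invented example base: the input is matched against stored sentence pairs by word overlap, and the translation of the closest example is reused. A real example-based system would then adapt the retrieved translation (for instance, substituting the word that differs), which this sketch omits.

```python
# Toy translation by analogy: the input is matched against stored example
# pairs by word overlap, and the translation of the closest example is
# returned. The example base is invented for illustration.

EXAMPLE_BASE = [  # (English source, French target)
    ("how much is that red umbrella", "combien coûte ce parapluie rouge"),
    ("how much is that small camera", "combien coûte ce petit appareil photo"),
    ("where is the train station", "où est la gare"),
]

def translate_by_analogy(sentence):
    words = set(sentence.lower().split())
    # Score each stored example by how many words it shares with the input.
    source, target = max(EXAMPLE_BASE,
                         key=lambda pair: len(words & set(pair[0].split())))
    return source, target

print(translate_by_analogy("How much is that red bag"))
# The "red umbrella" example shares the most words, so its translation is
# reused: ('how much is that red umbrella', 'combien coûte ce parapluie rouge')
```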
Several machine translation organizations claim a hybrid approach that couples rules and statistics. The approaches differ in a number of ways:
- Rules post-processed by statistics: translation is performed using a rule-based engine, and statistics are then used in an attempt to adjust or correct the output of that engine.
- Statistics guided by rules: rules are used to pre-process data in an attempt to better guide the statistical engine, and rules are also used to post-process the statistical output to perform functions such as normalization. This approach offers much more power, flexibility and control when translating; a sketch of such rule-based post-processing follows this list.
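The sketch below shows what such rule-based post-processing might look like: a few hand-written normalization rules clean up the hypothetical raw output of a statistical engine. Both the rules and the sample text are assumptions made purely for illustration.

```python
# Toy rule-based post-processing of (hypothetical) statistical output:
# hand-written normalization rules fix spacing, punctuation, number
# format and capitalization. Rules and sample text are assumptions.
import re

def normalize(raw_output):
    text = raw_output.strip()
    text = re.sub(r"\s+", " ", text)                      # collapse repeated spaces
    text = re.sub(r"\s+([,.!?])", r"\1", text)            # no space before punctuation
    text = re.sub(r"\b(\d+),(\d{2})\b", r"\1.\2", text)   # 3,50 -> 3.50
    if text and not text[0].isupper():
        text = text[0].upper() + text[1:]                 # sentence-initial capital
    return text

raw = "the ticket costs 3,50 euros ,  please pay at the counter ."
print(normalize(raw))
# -> "The ticket costs 3.50 euros, please pay at the counter."
```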
Word-sense disambiguation concerns finding a suitable translation when a word can have more than one meaning. The problem was first raised in the 1950s by Yehoshua Bar-Hillel, who pointed out that without a "universal encyclopedia" a machine would never be able to distinguish between the two meanings of a word. Numerous approaches have been designed to overcome this problem; they can be divided into shallow approaches and deep approaches.
Shallow approaches assume no knowledge of the text; they simply apply statistical methods to the words surrounding the ambiguous word. Deep approaches presume a comprehensive knowledge of the word. So far, shallow approaches have been more successful than deep approaches.
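A minimal sketch of a shallow approach is given below: the sense of the ambiguous word "bank" is chosen by counting clue words that appear near it. The clue lists are hand-picked assumptions; a real shallow system would derive such statistics from corpora rather than from fixed lists.

```python
# Toy shallow word-sense disambiguation: the sense of "bank" is chosen by
# counting hand-picked clue words found near it.

SENSE_CLUES = {
    "bank/finance": {"money", "account", "loan", "deposit", "interest"},
    "bank/river":   {"river", "water", "fishing", "shore", "mud"},
}

def disambiguate(sentence, window=5):
    tokens = sentence.lower().split()
    i = tokens.index("bank")
    # Collect the words within `window` positions of the ambiguous word.
    context = set(tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window])
    # Pick the sense whose clue words overlap the context the most.
    return max(SENSE_CLUES, key=lambda sense: len(SENSE_CLUES[sense] & context))

print(disambiguate("she opened an account at the bank to deposit money"))  # bank/finance
print(disambiguate("they sat fishing on the bank of the river"))           # bank/river
```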
Author:
Md Shaiful Islam
Technology Reporter