Tarun Aditya Kusupati

Computer Science, University of Dayton

United States

Hi, I'm Tarun Aditya Kusupati!

Research Intern at Ajeenkya DY Patil University

Deep Learning Engineer specializing in multilingual neural machine translation and transliteration. Experienced in implementing custom transformer architectures with Fairseq, leveraging GPU acceleration to train models efficiently on consumer-grade hardware, and benchmarking translation quality on the FLORES test sets. Proficient in Python, neural network design, and GPU optimization, with a foundation in scientific computing, statistical analysis, and machine learning built through an honors-level Applied Data Science program. A strong problem-solver and collaborative communicator, committed to continuous learning and ready to contribute technical expertise to challenging projects.

Socials

Education

Experience

Projects

Low Resource Multilingual Neural Machine Translation

Research Intern

  • Researched and selected neural network architectures suited to resource-constrained environments as the foundation for a multilingual neural machine translation model.
  • Implemented and fine-tuned a ~112-million-parameter model using Fairseq and a custom transformer architecture in Python, training efficiently on a consumer-grade GPU.
  • Benchmarked translation quality across language pairs on the FLORES test set, reaching an average BLEU score of 10.98 for English-to-Indic and 9.69 for Indic-to-English translations while prioritizing resource efficiency.
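A training run like the one described could be launched with Fairseq's command-line trainer. The sketch below is illustrative only: the data path, save directory, and hyperparameters are assumptions, not the actual configuration used for the ~112M-parameter model.

```shell
# Hypothetical fairseq-train invocation for a transformer NMT model.
# data-bin/en-indic and checkpoints/en-indic are placeholder paths;
# hyperparameters shown are common defaults, not the project's settings.
fairseq-train data-bin/en-indic \
    --arch transformer --share-decoder-input-output-embed \
    --optimizer adam --adam-betas '(0.9, 0.98)' \
    --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
    --criterion label_smoothed_cross_entropy --label-smoothing 0.1 \
    --max-tokens 4096 --fp16 \
    --save-dir checkpoints/en-indic
```

Mixed-precision training (`--fp16`) and a token-based batch size (`--max-tokens`) are the usual levers for fitting a model of this scale onto a single consumer-grade GPU.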

Low Resource Multilingual Neural Machine Transliteration

Research Intern

  • Researched and selected neural network architectures tailored to low-resource settings as the basis for a multilingual neural machine transliteration model.
  • Implemented and fine-tuned a compact 16-million-parameter model using Fairseq and a custom transformer architecture in Python, running efficiently on a consumer-grade GPU.
  • Benchmarked transliteration quality across language pairs on the Aksharantar test set, achieving an average Top-1 accuracy of ~80% for English-to-Indic transliterations with an emphasis on resource efficiency.
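The Top-1 accuracy metric above is simply the fraction of test items whose top-ranked model output exactly matches the reference. A minimal sketch, using made-up predictions and references rather than actual Aksharantar data:

```python
# Minimal sketch: Top-1 accuracy for transliteration output.
# The predictions and references below are hypothetical examples,
# not real model outputs or Aksharantar test data.

def top1_accuracy(predictions, references):
    """Fraction of items where the top prediction exactly matches the reference."""
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)

preds = ["दिल्ली", "मुंबई", "चेन्नई", "कोलकता"]   # model's top outputs (hypothetical)
refs  = ["दिल्ली", "मुंबई", "चेन्नई", "कोलकाता"]  # gold transliterations
print(f"Top-1 accuracy: {top1_accuracy(preds, refs):.2%}")  # → 75.00%
```

Exact-match accuracy is strict for transliteration, since a single misplaced matra counts as a full error; character-level metrics are sometimes reported alongside it.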

Languages

English

Professional

Hindi

Intermediate

Skills

Artificial Intelligence

Machine Learning

Deep Learning

Natural Language Processing (NLP)

Python

PyTorch

Docker
