Description
NTT DATA is a trusted global innovator of business and technology services, helping clients innovate, optimize, and transform for success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem.
Kindly share the candidate database in the Excel sheet.
Eligibility criteria:
BE/BTech or MTech, 2023/2024 pass-outs
Minimum 60% marks throughout education.
Streams: AI/ML, CSE, IT/IS
Indian nationality
Student information:
Students must register and provide all information requested in the link below.
Students should meet basic eligibility criteria to be considered for the interview process.
Aadhaar and PAN cards are mandatory for participation.
Job Description:
Hands-on development of AI/ML/GenAI solutions using Python or R
Design and build Deep Learning models/advanced GenAI plugins and services that streamline workflows and develop novel solutions to tackle complex challenges
Build, maintain, deploy, and support the internal GenAI platform's LLM/LangChain/OpenAI integration (APIs/SDKs, services, platform backend, integration with third-party models and vendors, etc.)
Conduct code reviews and PR reviews for contributions from within the team and from the firm’s inner source contributors
Collaborate with our DevOps team to develop and maintain efficient CI/CD pipelines and processes, enhancing our development lifecycle
Collaborate with our Cloud Engineering team to build and maintain a multi-cloud platform offering and leverage model offerings across multiple cloud providers, while adhering to industry best practices
Collaborate with our research, product, data and UI/UX teams to ensure requirements are being met in the platform build-out
Collaborate on design and implementation of best practices for Vector DB architecture, including data modeling, entitlements, indexing, and query optimization
Stay ahead of the curve in generative AI and LLMs and collaborate with our research team to ensure that we are offering capabilities on the platform in a timely manner
Produce clear and detailed design and technical documentation for our APIs, features, tools, and other platform capabilities, ensuring transparency, accessibility, and ease of use
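For candidates unfamiliar with the Vector DB responsibilities listed above (data modeling, indexing, query optimization), the core idea can be sketched as a toy brute-force similarity store. The class and method names here are illustrative assumptions, not the firm's actual API; production Vector DBs such as Pinecone add approximate-nearest-neighbour indexing on top of this basic pattern.

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


class ToyVectorStore:
    """Minimal brute-force vector store: upsert embeddings, query by similarity."""

    def __init__(self):
        self._items = []  # list of (item_id, vector) pairs

    def upsert(self, item_id, vector):
        self._items.append((item_id, vector))

    def query(self, vector, top_k=1):
        # Rank all stored vectors by cosine similarity to the query vector.
        ranked = sorted(self._items, key=lambda it: cosine(it[1], vector), reverse=True)
        return [item_id for item_id, _ in ranked[:top_k]]
```

A real deployment would replace the linear scan in `query` with an index structure (HNSW, IVF) to keep latency low at scale.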
Good understanding of ANY ONE of the technologies below:
AI/ML with Python
Primary Cloud Platform: Microsoft Azure.
Azure Services: Azure OpenAI, Azure AI Services, Cognitive Services.
Secondary Cloud Platforms: AWS, GCP.
AWS Services: Amazon Bedrock, Amazon SageMaker, AWS AI Services (Lex, Comprehend, Polly, Rekognition).
GCP Services: Google Vertex AI, Google Cloud AI, Google Cloud Natural Language, Speech-to-Text and Text-to-Speech, Vision AI.
LLM Experience: OpenAI, LLaMA, Grok, Anthropic, Gemini, Bedrock, Cohere, Mistral; Hugging Face Transformers, TensorFlow.
Custom model development for text summarization and question answering; LLM evaluation using BLEU and ROUGE metrics.
Programming Proficiency: Python, SQL.
AI/ML Expertise: Cognitive services, NLP, generative AI.
Frameworks and Tools: LangChain, LlamaIndex, Vector DBs like Pinecone.
Front-end Skills: React, Angular, Streamlit, Chainlit, Flask.
ML Techniques: Supervised and unsupervised learning; advanced NLP and computer vision.
LLMOps and DevOps: Experience with LLMOps and DevOps practices, including CI/CD pipelines, containerization, and orchestration.
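To give a concrete sense of the LLM evaluation skills mentioned above, here is a simplified ROUGE-1 F1 computation in plain Python. This is a minimal illustration of unigram-overlap scoring, not the full ROUGE specification (real evaluations typically use a library such as rouge-score and also report BLEU, which adds n-gram precision with a brevity penalty).

```python
from collections import Counter


def rouge1_f(reference: str, candidate: str) -> float:
    """Simplified ROUGE-1 F1: harmonic mean of unigram precision and recall
    between a reference text and a model-generated candidate."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, scoring the candidate "the cat is on the mat" against the reference "the cat sat on the mat" yields five overlapping unigrams out of six on each side, i.e. an F1 of about 0.83.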
Compensation:
During the training period, a stipend of ₹20,000 per month will be paid.
Post-training, the compensation is ₹8 LPA plus incentives.
Training Duration: 60 to 90 Days