Posted Apr 15, 2026

AI Developer - Python + AWS Bedrock

This is a fully remote position, open to applicants from anywhere.

Job Description: AI Developer (5+ Years Experience)

Budget: 1L monthly
Contract Duration: 12 months
Work Mode: Permanent remote
Years of Experience: 5+
Immediate joiners only

Mandatory Skills:
1. LangChain
2. AWS Lambda, SNS, SQS, API Gateway
3. FastAPI
4. pgvector database
5. Postman
6. Git

Job Summary:
We are seeking a skilled AI Developer with expertise in Python, AI/ML development, AWS Bedrock, and cloud computing. The ideal candidate has experience with machine learning frameworks, data engineering, and cloud infrastructure. This role requires strong problem-solving skills, AI model optimization capabilities, and hands-on experience deploying scalable AI solutions.

Job Responsibilities:

1. AI & Machine Learning Development:
• Develop, train, fine-tune, and deploy machine learning models using TensorFlow, PyTorch, scikit-learn, and NumPy.
• Work with Large Language Models (LLMs), Retrieval-Augmented Generation (RAG) systems, NLP, and computer vision.
• Integrate and utilize pre-trained AI models, including those in AWS Bedrock.
• Optimize AI model performance, scalability, and efficiency.

2. Backend Development & API Integration:
• Design and develop RESTful APIs using Flask or FastAPI for AI model deployment.
• Implement a robust microservices architecture for AI-based applications.
• Ensure secure, efficient, and scalable backend AI solutions.

3. Cloud & DevOps:
• Deploy AI solutions using AWS Bedrock and integrate with services like S3, Lambda, SageMaker, ECS/EKS, and IAM.
• Manage cloud infrastructure provisioning, cost optimization, and security policies.
• Work with CI/CD pipelines for seamless AI model deployment.
• Utilize Docker and Kubernetes for containerized deployments.

4. Database & Data Engineering:
• Design, query, and optimize relational (PostgreSQL, MySQL) and NoSQL (DynamoDB, MongoDB) databases.
• Develop and automate data pipelines using Apache Airflow, AWS Glue, or Spark.
• Implement ETL processes for AI data handling.

5. Collaboration & Documentation:
• Collaborate with data scientists, engineers, and product managers to integrate AI into applications.
• Write clear documentation for AI models, workflows, and deployment processes.
• Explain AI concepts to non-technical stakeholders when required.

Job Requirements:

Essential Requirements:

✅ Programming & AI Development:
• 5+ years of experience in Python development with an AI/ML focus.
• Proficiency in AI/ML frameworks such as TensorFlow, PyTorch, and scikit-learn.
• Experience with neural networks, NLP, LLMs, RAG, and computer vision.

✅ Backend & API Development:
• Experience in RESTful API development using Flask or FastAPI.
• Strong knowledge of OOP, functional programming, and microservices architecture.

✅ Cloud & DevOps:
• Experience with AWS Bedrock and other AWS services (S3, Lambda, SageMaker, ECS/EKS, IAM).
• Hands-on experience with containerization (Docker) and orchestration (Kubernetes).
• Understanding of CI/CD pipelines and DevOps practices.

✅ Data Engineering & Databases:
• Strong knowledge of PostgreSQL, MySQL, DynamoDB, and MongoDB.
• Experience with data pipelines, ETL processes, and Apache Airflow, Spark, or AWS Glue.

✅ Version Control & Collaboration:
• Proficiency in Git and best practices for collaborative development.
• Familiarity with project management tools such as Jira or Trello.

Soft Skills:
✔️ Strong problem-solving and analytical thinking.
✔️ Excellent communication skills to explain AI models and technical concepts.
✔️ Ability to work collaboratively in cross-functional teams.
✔️ Adaptability to learn and implement emerging AI/ML technologies.

Contact: [email protected]
Posted by: Talpro - Leaders in Technology Hiring
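For candidates unfamiliar with the RAG systems this posting mentions, the core retrieval step can be sketched in a few lines: rank stored document embeddings by cosine similarity to a query embedding. This is a minimal illustration only; the document IDs and toy 2-dimensional vectors below are hypothetical, and in a production stack matching this posting the ranking would typically be delegated to pgvector inside PostgreSQL rather than done in application code.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, docs, k=2):
    """Return the k (doc_id, score) pairs most similar to query_vec.

    docs: list of (doc_id, embedding) pairs.
    """
    scored = [(doc_id, cosine_similarity(query_vec, emb)) for doc_id, emb in docs]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Toy corpus with hypothetical 2-d "embeddings" (real ones have hundreds of dims).
docs = [("doc-a", [1.0, 0.0]), ("doc-b", [0.0, 1.0]), ("doc-c", [0.9, 0.1])]
print(top_k([1.0, 0.0], docs))  # doc-a first (exact match), then doc-c
```

The retrieved documents would then be stuffed into the LLM prompt (e.g. via LangChain, with embeddings and generation served from AWS Bedrock), which is the "augmented generation" half of RAG.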