Jobs and vacancies near you on InRadius.

All Jobs

Sourced Job
Cloud Operations Engineer

Marathahalli, Bangalore

3 years

About the job
- Monitor and manage AWS infrastructure across multiple accounts and regions (an illustrative monitoring sketch follows this listing).
- Provide 24/7 operational support through rotational shifts, including weekends and holidays.
- Respond to alerts, troubleshoot issues, and perform root cause analysis.
- Implement and maintain automation scripts using tools like CloudFormation, Terraform, or AWS CDK.
- Manage EC2 instances, S3 buckets, RDS databases, Lambda functions, and other AWS services.
- Ensure compliance with security policies and best practices (IAM, KMS, VPC, etc.).
- Collaborate with DevOps, Security, and Application teams to support deployments and upgrades.
- Maintain documentation for infrastructure, processes, and incident resolution.
- Participate in disaster recovery planning and execution.
- Continuously optimize cloud resources for cost and performance.

You will be successful in this role if you have:

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience with AWS cloud services.
- Strong understanding of networking concepts (VPC, Subnets, Route Tables, NAT, VPN).
- Experience with monitoring tools (CloudWatch, Datadog, Prometheus, etc.).
- Proficiency in scripting languages (Python, Bash, PowerShell).
- Familiarity with CI/CD pipelines and tools (Jenkins, GitLab, CodePipeline).
- Knowledge of Infrastructure as Code (IaC) tools.
- AWS certifications (e.g., Solutions Architect Associate, SysOps Administrator).
- Excellent problem-solving and communication skills.
- Willingness to work in rotational shifts, including nights and weekends.

Preferred Qualifications
- Experience in multi-cloud environments (Azure, GCP).
- Exposure to container technologies (Docker, ECS, EKS).
- Understanding of ITIL processes and incident management tools (ServiceNow, Jira).

Shift Details
- Rotational shifts covering 24/7 operations.
- On-call support as needed.
- Flexibility to adapt to changing schedules.

AWS Server Management, Amazon EC2, Amazon S3, Amazon Relational Database Service (RDS), +16 more
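For illustration only (not part of the posting): a minimal sketch, in Python with boto3, of the kind of multi-region monitoring check this role describes. It lists CloudWatch alarms currently in the ALARM state; it assumes AWS credentials and a default region are already configured, and the output format is arbitrary.

```python
# Minimal sketch: list every CloudWatch alarm currently firing, per region.
# Assumes boto3 is installed and AWS credentials/default region are configured.
import boto3

# Enumerate the regions enabled for this account.
regions = [r["RegionName"] for r in boto3.client("ec2").describe_regions()["Regions"]]

for region in regions:
    cloudwatch = boto3.client("cloudwatch", region_name=region)
    # Only metric alarms that are currently in the ALARM state.
    for alarm in cloudwatch.describe_alarms(StateValue="ALARM")["MetricAlarms"]:
        print(f"{region}: {alarm['AlarmName']} -> {alarm['StateReason']}")
```

A real operations setup would route these alerts into an incident tool or on-call pager rather than printing them, but the API calls above are the core of such a check.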

Sourced Job
Fullstack Developer

Marathahalli, Bangalore

2 years

Technical Requirements
- Backend development using Java/Spring Boot
- UI development with React or AngularJS
- Experience with Oracle, DB2, MySQL, or similar databases
- Strong understanding of REST APIs, JSON, XML
- Ability to design/manage complex data structures and data workflows
- Experience working in Agile/Scrum teams
- Familiarity with CI/CD, automated testing, and test-data-driven development
- Some exposure to Kafka or streaming platforms
- Knowledge of Docker, Kubernetes, and AWS is a plus
- Strong learning mindset and self-motivation

Key Responsibilities
- Design and build systems used by 40% of the world’s population
- Contribute to Visa’s standards for scalability, security, and reusability
- Collaborate with cross-functional teams to create design documents and develop software solutions
- Improve product quality and support new business flows in agile squads
- Build scalable products for merchants, B2B clients, and government solutions
- Use innovative technologies for payment services, real-time payments, transaction platforms, and BNPL products
- Participate in mentorship and continuous learning programs

Qualifications
- Basic understanding of project requirements and ability to ask clarifying questions
- Ability to review solution strategies and improve product feature design
- Proficiency in Java/Python to write code under guidance
- Support pilot projects involving new tech to enhance user experience
- Identify and escalate bugs that affect website functionality

Nice to Have
- Awareness of new technologies to enhance architectures
- Strategic thinking and good business acumen
- Experience in the payment industry

What They Offer
- Opportunity to work on cutting-edge, high-impact projects
- Motivated and collaborative team environment
- Competitive salary
- Flexible schedule
- Medical insurance and sports benefits
- Corporate social events
- Professional development support
- Well-equipped office

Java, Spring Boot, React JS, AngularJS, +16 more

Sourced Job
AI/ML Engineer

Marathahalli, Bangalore

3 years

Role Overview:
As an AI/ML Engineer at Plivo, you’ll play a hands-on role in building and scaling production-grade AI models that power our global communications platform. Working closely with product and engineering teams, you’ll design, train, and deploy models that solve real-world problems in speech, language, and voice automation at scale. This is a high-visibility, high-impact opportunity, perfect for analytical, curious individuals who want to contribute meaningfully from Day 1.

Key Responsibilities
- Train, fine-tune, and deploy AI/ML models for use cases like speech recognition, speaker isolation, and turn detection across languages and verticals (an illustrative inference sketch follows this listing).
- Build scalable inference pipelines and integrate models seamlessly into the production environment.
- Optimize models for latency, accuracy, and throughput for real-time, global-scale AI.
- Analyze large datasets to identify patterns, surface improvements, and drive model performance.
- Collaborate cross-functionally with engineering, product, and data teams to deliver production-ready AI features.
- Explore and implement open-source frameworks, build internal AI tooling, and stay ahead of the curve.
- Stay current with the latest trends in LLMs, generative AI, and voice intelligence, because we’re always pushing the frontier.

What We’re Looking For
- B.Tech in Computer Science, AI/ML, Data Science, or a related field from a top engineering school.
- Strong theoretical understanding of ML algorithms, deep learning, and model training.
- Proficiency in Python and experience with frameworks like PyTorch, TensorFlow, or Hugging Face Transformers.
- Hands-on experience building models (e.g., NLP, ASR, TTS, embeddings, voice agents); Kaggle or competition experience is a strong plus.
- Solid experience in data processing, feature engineering, and model evaluation techniques.
- Analytical mindset, bias for action, and excellent communication and collaboration skills.

Nice to Have
- Experience working in real-time systems and deploying models in production.
- Familiarity with MLOps tools and pipelines.
- Exposure to speech, voice, or multimodal datasets.
- Contributions to open-source ML projects.
- Experience building models from scratch.

Python, PyTorch (Machine Learning Library), TensorFlow, Hugging Face Transformers, +16 more
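For illustration only (not part of the posting): a minimal sketch of speech-to-text inference with Hugging Face Transformers, one of the frameworks listed above. It assumes the openai/whisper-small checkpoint and a local audio file; the file name is hypothetical, and ffmpeg must be available for audio decoding.

```python
# Minimal sketch: transcribe a short audio clip with a pretrained ASR pipeline.
# Assumes transformers and torch are installed and ffmpeg is on the PATH.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")

# The pipeline returns a dict whose "text" key holds the transcription.
result = asr("sample_call.wav")  # hypothetical file name
print(result["text"])
```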

Sourced Job
QA Engineer Automation

Marathahalli, Bangalore

3 years

About the job
Anko is the global capability centre for Kmart Group Australia, fuelling the growth aspirations of the iconic Australian retail brands Kmart, Target and Anko. Based in Bangalore, India, we strive to accelerate retail innovation by building competitive capabilities in Technology, Data Sciences and Business Services that enable our brands to deliver delightful experiences to our in-store and online customers.

Primary Purpose of the Role
- Collaborate with QA teams to develop effective test strategies and test plans.
- Develop, document, and maintain functional test cases and other test artifacts such as test data, data validation, harness scripts and automated scripts (a generic API-test sketch follows this listing).

Technical Skills
- Karate, Gatling, Cucumber, BDD
- AWS/cloud capabilities
- ISTQB Foundation certification

Traits/Abilities
- Excellent written and verbal communication skills
- Excellent analytical skills with attention to detail
- Good self-organisational skills

Work Experience
- 3+ years’ experience working with modern architectures (e.g. microservices, event streaming, single-page apps, containers, serverless, infrastructure as code).
- 3+ years’ experience working with Agile frameworks and tracking test cases and defects using JIRA and other test management software.
- Proven experience as a QA tester or in a similar testing role.

Karate, Gatling, Cucumber, Behavior-Driven Development, +16 more
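For illustration only (not part of the posting), and in Python rather than the Karate DSL named above: a generic automated API check with pytest and requests, showing the shape of a functional test with response validation. The service URL, payload, and response fields are hypothetical.

```python
# Generic API-test sketch with pytest + requests; run with `pytest`.
# The service URL, payload, and expected fields below are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # placeholder service under test


def test_create_order_returns_id_and_echoes_quantity():
    payload = {"sku": "ABC-123", "quantity": 2}
    response = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)

    # Status-code and response-schema checks, in the spirit of a BDD scenario.
    assert response.status_code == 201
    body = response.json()
    assert "orderId" in body
    assert body["quantity"] == payload["quantity"]
```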

Sourced Job
Data Engineer

Marathahalli, Bangalore

3 years

Responsibilities
- Design and maintain scalable data pipelines and ETL/ELT processes for structured and unstructured data (an illustrative DAG sketch follows this listing).
- Build robust data models, data lakes, and data warehouses to support analytics and ML use cases.
- Collaborate with software engineers, data scientists, and business teams to deliver production-ready data solutions.
- Ensure data quality, governance, security, and lineage across all data assets.
- Automate data workflows and implement monitoring/alerting for operational excellence.
- Apply engineering best practices such as CI/CD, testing, code reviews, and performance optimization.
- Work with ML engineering teams to develop MLOps pipelines for model deployment, monitoring, and scalability.
- Optimize cloud-based data platforms (AWS, Azure, GCP) for cost, performance, and security.
- Create documentation, standards, and guidelines to improve data engineering processes.

Required Skills & Qualifications
- Strong understanding of data engineering fundamentals: data modeling, pipelines, batch/streaming systems.
- Experience with Kafka, Spark, Flink, or similar streaming technologies.
- Proficient in cloud services (AWS/Azure/GCP) and cloud-native tools like BigQuery, Redshift, Synapse, Databricks, Snowflake.
- Skilled with modern orchestration frameworks (Airflow, Prefect, Dagster).
- Strong coding ability in Python, SQL, and one more language (Scala/Java preferred).
- Experience with CI/CD, version control, testing, observability, and performance tuning.
- Knowledge of operational excellence practices (SRE principles, monitoring, alerting).
- Familiarity with MLOps tools such as MLflow, Kubeflow, Vertex AI, or Azure ML.
- Understanding of Docker, Kubernetes, and containerized workflows.
- Knowledge of data governance, security, access control, and GDPR/PII compliance.

Preferred Qualifications
- Experience designing large-scale data platforms for analytics and ML workloads.
- Exposure to real-time streaming architectures.
- Understanding of DevOps principles applied to data and ML pipelines.
- Strong problem-solving skills focused on scalability and reliability.
- Excellent communication skills and experience working with global cross-functional teams.

Data Modeling, ETL Testing, Big Data Processing, Cloud Services, +16 more
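For illustration only (not part of the posting): a minimal sketch of a daily extract-transform-load pipeline expressed as an Airflow DAG, one of the orchestration frameworks listed above. It assumes Airflow 2.x; the DAG id, task logic, and data are hypothetical placeholders.

```python
# Minimal sketch of a daily extract -> transform -> load DAG (Airflow 2.x assumed).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder for pulling raw records from an upstream source.
    return [{"id": 1, "amount": 42.0}, {"id": 2, "amount": None}]


def transform(ti, **context):
    # Read the extract task's output from XCom and drop incomplete rows.
    rows = ti.xcom_pull(task_ids="extract")
    return [r for r in rows if r["amount"] is not None]


def load(ti, **context):
    # A real pipeline would write to a warehouse table here.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"would load {len(rows)} rows")


with DAG(
    dag_id="example_daily_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

Return values from PythonOperator callables are pushed to XCom automatically, which is what lets the downstream tasks read them with xcom_pull.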

Sourced Job
Software Developer

Marathahalli, Bangalore

3 years

About the job
The person in this position is primarily responsible for contributing to the development of Creaform’s software. Within an agile team, they are responsible for the software architecture, user interfaces, user experience, interactions with online services, and interactive 3D visualization tools.

Tasks & Responsibilities
- Develop ergonomic user interfaces for application software operating on Windows;
- Develop the software architecture necessary to support these applications and the operation of Creaform’s 3D scanning technologies;
- Experimentally test and validate the developed software;
- Interact with the testing and support team to ensure product quality;
- Adhere to the established processes, work methods, and development standards.

Requirements
- University degree in computer engineering, software engineering, or computer science;
- Experience/knowledge in the following areas:
  - C++, C#, WPF programming
  - Visual Studio, Git development environment
  - Computer architecture
  - 3D geometry and matrix calculations, an asset
  - OpenGL programming, an asset
- Fluent in English

C++, C#, WPF, Microsoft Visual Studio, +16 more