Join Our Team

Open Positions

Job Location: Bangalore (WFO)

Full Stack Developer (Java)

1. Essential Duties and Responsibilities

  • Develop and support scalable, extensible, and highly available data pipelines on heterogeneous datasets that power downstream applications and systems and serve content to our web and API products.
  • Closely collaborate with partners across product and design, engineering, and business teams to drive innovations that improve our customers’ experience.
  • Follow software-development best practices including test-driven development, contributing to documentation, feature flagging, etc.
  • Help maintain and improve existing ETL pipelines (a minimal ETL sketch follows this list).
  • Work with your team to troubleshoot and fix issues in ingest and processing, considering dependencies and integration points.
  • Collaborate with DevOps to plan resources and continuously optimize the infrastructure and configuration of our data pipelines to ensure healthy, high-performance production deployments.
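
To make the pipeline duties above concrete, here is a minimal extract-transform-load step in Python. It is a sketch only: the source URL, field names, and target table are hypothetical, and a production pipeline would add retries, schema validation, and monitoring.

    import json
    import sqlite3
    import urllib.request

    # Hypothetical source endpoint and local target -- placeholders, not real systems.
    SOURCE_URL = "https://example.com/api/orders"
    TARGET_DB = "warehouse.db"

    def extract(url):
        """Pull raw JSON records from an upstream API."""
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    def transform(records):
        """Keep only well-formed records and normalize field types."""
        return [
            (int(r["id"]), float(r["amount"]))
            for r in records
            if "id" in r and "amount" in r
        ]

    def load(rows):
        """Idempotently upsert rows into the target table."""
        with sqlite3.connect(TARGET_DB) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)"
            )
            conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)

    if __name__ == "__main__":
        load(transform(extract(SOURCE_URL)))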

2. Skills And Experience

  • Strong expertise in advanced data modeling, schema and ETL process design, implementation, and maintenance.
  • Experience with multiple data lake/warehouse technologies (e.g., Databricks, Snowflake, Presto, Dremio).
  • Strong knowledge of data lake best practices.
  • Strong background in advanced SQL, with a focus on understanding, manipulating, processing, and extracting value from large datasets and data streams.
  • Cloud experience (Azure or AWS preferred).
  • Advanced Python or Java skills.
  • Experience with a variety of data formats (JSON, Parquet, Excel, flat files).
  • Experience with databases and data storage frameworks, including Microsoft SQL Server, PostgreSQL, Elasticsearch, MongoDB, Cosmos DB, and Delta Lake.
  • Expertise in cloud messaging platforms such as Apache Kafka or Azure Service Bus (a minimal Kafka producer sketch follows this list).
  • Solid comprehension of common design patterns, algorithms, and data structures.
  • Working knowledge of containerization and modern cloud deployments including Docker and Kubernetes.
  • Bachelor’s degree in computer science or a related field.
  • Excellent communication and presentation skills.
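
As an illustration of the messaging expertise listed above, here is a minimal sketch of publishing a pipeline event to Apache Kafka with the kafka-python client. The broker address, topic name, and event payload are placeholders, and error handling is omitted for brevity.

    import json

    from kafka import KafkaProducer  # pip install kafka-python

    # Placeholder broker and topic -- substitute your cluster's values.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    # send() is asynchronous; flush() blocks until the broker acknowledges delivery.
    producer.send("pipeline-events", {"job": "daily_ingest", "status": "succeeded"})
    producer.flush()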

Python Developer

1. Role Description

  • The role involves building a solution that provides an interface to an underlying ML-based solution, and deploying that solution on an AWS instance.
  • Most ML and deep learning solutions (built on popular frameworks such as TensorFlow) expose Python interfaces, so the ML solutions sit on a Python-based framework. Django is mandatory, and experience developing RESTful APIs using Django is required (a minimal sketch follows this list).
  • The solution will be deployed on AWS, so AWS experience is a must (EC2, Lambda, etc.).
  • All the solutions will provide data to SAC-P or SAP HANA, so front-end skills are required for widget creation: JavaScript/TypeScript, along with HTML, CSS, jQuery, etc.
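
A minimal sketch of the kind of Django REST endpoint described above, wrapping a hypothetical ML model behind a JSON API. The model_predict function, payload shape, and URL path are illustrative placeholders, not part of the actual stack.

    # views.py -- a minimal JSON endpoint wrapping a hypothetical ML model
    import json

    from django.http import JsonResponse
    from django.views.decorators.csrf import csrf_exempt
    from django.views.decorators.http import require_POST

    def model_predict(features):
        """Placeholder for the underlying ML solution (e.g., a TensorFlow model)."""
        return {"score": 0.5, "inputs_seen": len(features)}

    @csrf_exempt
    @require_POST
    def predict(request):
        """Parse the JSON request body and return the model's prediction."""
        features = json.loads(request.body)
        return JsonResponse(model_predict(features))

    # urls.py -- route the endpoint (shown as comments to keep the sketch in one file)
    # from django.urls import path
    # urlpatterns = [path("api/predict/", predict)]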

2. Skills

  • Python
  • Cloud: AWS (EC2, Lambda); a minimal Lambda handler sketch follows this list
  • Frameworks: Django, RESTful APIs
  • Databases: SQL (relational databases)
  • Front-end: JavaScript/TypeScript, HTML, CSS, jQuery
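
For illustration, a minimal AWS Lambda handler in Python of the kind this role would deploy. The event shape assumes an API Gateway proxy integration, and the response body is a hypothetical stand-in for calling the ML solution.

    import json

    def lambda_handler(event, context):
        """Entry point AWS Lambda invokes; 'event' carries the request payload."""
        # With an API Gateway proxy integration, the JSON payload arrives in event["body"].
        payload = json.loads(event.get("body") or "{}")
        result = {"echo": payload}  # stand-in for invoking the underlying ML solution
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(result),
        }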

Data Engineer

1. Essential Duties and Responsibilities

  • Develop, deploy, and support real-time, automated, scalable data streams from a variety of sources into the data lake or data warehouse.
  • Develop and implement data auditing strategies and processes to ensure data quality (a quality-check sketch follows this list); identify and resolve problems associated with large-scale data processing workflows; implement technical solutions to maintain data pipeline processes and troubleshoot failures.
  • Collaborate with technology teams and partners to specify data requirements and provide access to data.
  • Tune application and query performance using profiling tools and SQL or other relevant query languages.
  • Understand business, operations, and analytics requirements for data.
  • Build data expertise and own data quality for assigned areas of ownership.
  • Work with data infrastructure to triage issues and drive to resolution.
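
As a small illustration of the data-auditing duties above, here is a sketch of row-level quality checks using pandas. The column names, rules, and sample data are hypothetical; a real audit would log results, alert on failures, and quarantine bad records.

    import pandas as pd

    # Hypothetical incoming batch with the columns a real audit would key on.
    df = pd.DataFrame({"id": [1, 2, 2, None], "amount": [10.0, -5.0, 3.5, 8.0]})

    # Each rule yields a boolean mask marking failing rows.
    checks = {
        "missing_id": df["id"].isna(),
        "duplicate_id": df["id"].duplicated(keep=False) & df["id"].notna(),
        "negative_amount": df["amount"] < 0,
    }

    # Summarize failures per rule so the pipeline can decide to halt or continue.
    for rule, failed in checks.items():
        print(f"{rule}: {int(failed.sum())} failing rows")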

2. Required Qualifications

  • Bachelor’s Degree in Data Science, Data Analytics, Information Management, Computer Science, Information Technology, related field, or equivalent professional experience.
  • 7+ years of overall experience.
  • 3+ years of experience working with SQL.
  • 3+ years of experience implementing data warehouses based on modern data architectures.
  • 2+ years of experience working with data warehouses such as Redshift, BigQuery, or Snowflake, with an understanding of data architecture design.
  • Excellent software engineering and scripting knowledge.
  • Strong communication skills (both in presentation and comprehension) along with the aptitude for thought leadership in data management and analytics.
  • Expertise with data systems working with massive data sets from various data sources.
  • Ability to lead a team of Data Engineers.

3. Preferred Qualifications

  • Experience working with time series databases.
  • Advanced knowledge of SQL, including the ability to write stored procedures, triggers, analytic/windowing functions, and tuning.
  • Advanced knowledge of Snowflake, including the ability to write and orchestrate streams and tasks.
  • Background in Big Data, non-relational databases, Machine Learning and Data Mining.
  • Experience with cloud-based technologies including SNS, SQS, SES, S3, Lambda, and Glue.
  • Experience with modern data platforms such as Redshift, Cassandra, DynamoDB, Apache Airflow, Spark, or Elasticsearch (a minimal Airflow DAG sketch follows this list).
  • Expertise in Data Quality and Data Governance.
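
To illustrate the orchestration experience listed above, here is a minimal Apache Airflow DAG in Python (the schedule parameter assumes Airflow 2.4 or newer). The dag_id, schedule, and task bodies are placeholders for real ingest and load logic.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pulling from source")  # placeholder task body

    def load():
        print("writing to warehouse")  # placeholder task body

    # A daily two-step pipeline; catchup=False skips backfilling past runs.
    with DAG(
        dag_id="daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_load  # load runs only after extract succeeds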