The Data Engineer is responsible for helping to select and implement the tools and processes required for a data processing pipeline that supports customer requirements. Primary responsibilities include implementing ETL (extract, transform, and load) pipelines and monitoring/maintaining data pipeline performance. The Data Engineer is proficient in distributed computing principles, is familiar with key architectures including the Lambda and Kappa architectures, and has broad experience across data stores (e.g., SQL, PostgreSQL, MongoDB, InfluxDB, Neo4j, Redis), messaging systems (e.g., Apache Kafka, RabbitMQ), and data processing engines (e.g., BigQuery, Redshift, Snowflake). The ideal candidate has three or more years' experience working on solutions that collect, process, store, and analyse huge volumes of data, fast-moving data, or data with significant schema variability.
Location: Coimbatore, Chennai, Bangalore
Requirement: B.E, B.Tech, MCA, M.E, M.Tech
- 5+ years of experience building advanced analytics (including machine learning) solutions.
- 5+ years of experience with one or more scripting languages, such as R, Python, Scala, or SQL.
- 5-10 years of experience building data pipelines to operationalize end-to-end solutions.
- 5+ years of experience in data analytics and data mining with proven quantitative orientation.
- 5+ years of experience working on complex reporting requirements, large and complex data sets, and various reporting tools, such as Tableau and Power BI.
- 4+ years of demonstrated ability to deliver high-quality reporting metrics to customers and executives.
- 4+ years of proven ability to validate data results for accuracy.
- 5+ years of experience delivering proven database modernization solutions.
- 5+ years of experience in ingesting and performing advanced analytics on data from multiple sources, including batch analytics, interactive analytics, and real-time/streaming analytics.
Data Stores: SQL, PostgreSQL, BigQuery, Redshift, Snowflake, MariaDB, MongoDB, InfluxDB, Neo4j, Redis, Elasticsearch
Programming/Scripting Languages: Scala, SQL, Python, MapReduce
Cloud: Azure, AWS, GCP
- Proven ability to work in environments following Agile methodologies.
- Proven track record of driving decisions collaboratively, resolving conflicts, and ensuring follow-through.
- Presentation skills with a high degree of comfort with both large and small audiences.
- Prior work experience in a consulting/architecture position within a software & services company.
- Problem-solving mentality leveraging internal and/or external resources.
- Exceptional verbal and written communication skills.