Skills:
SQL, ETL, Python, Cloud, Google BigQuery, Google Cloud Dataflow
Immediate Job Opportunity for Data Engineer
Are you a passionate developer or data enthusiast ready to make a difference in healthcare? Netmeds, India's leading online pharmacy, is expanding its Chennai tech hub and seeking talented individuals for multiple roles.
Interested candidates can send their resumes to sujana.dv@ril.com or dv.sujana@c2info.com
Role: Data Engineer (Python, SQL, BigQuery)
Experience: 2 to 6 years
Mode of Hire: Permanent
Location: Chennai (Work from office)
Number of Positions: 2
Mode of Interview: F2F / Video Call
Job Description
- Data Pipeline Development: Design, build, and maintain efficient and scalable data pipelines to extract, transform, and load (ETL) data from various sources (e.g., transaction systems, marketing platforms, customer databases) into our BigQuery data warehouse (a brief illustrative sketch follows this list).
- Data Modeling and Warehousing: Develop and optimize data models and schemas to ensure data accuracy, integrity, and accessibility for analysis.
- Dashboard and Report Creation: Utilize Tableau to create intuitive and insightful dashboards and reports that visualize key performance indicators (KPIs), trends, and customer behavior patterns.
- Query Optimization: Optimize complex SQL queries to improve performance and efficiency of data processing.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to gather requirements, understand business needs, and provide data-driven solutions.
- Performance Monitoring: Monitor and troubleshoot data pipelines, dashboards, and reports to ensure optimal performance and data quality.
- Technology Exploration: Stay abreast of emerging technologies and industry best practices in data engineering and analytics.
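For candidates curious about the day-to-day work, the snippet below is a minimal sketch of the kind of batch load and query-efficiency task described above: staging a CSV export from Cloud Storage into a date-partitioned BigQuery table using the google-cloud-bigquery Python client. The project, bucket, and table names are hypothetical placeholders, not actual Netmeds resources.

```python
from google.cloud import bigquery

# Hypothetical project, bucket, and table names -- for illustration only.
client = bigquery.Client(project="example-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,                       # skip the CSV header row
    autodetect=True,                           # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    # Partitioning by date keeps downstream queries cheaper and faster.
    time_partitioning=bigquery.TimePartitioning(field="order_date"),
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/orders_2024-01-01.csv",
    "example-analytics-project.sales.orders",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows")
```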
Required Skills
- Bachelor's or Master's degree: in Computer Science, Data Science, or a related field.
- Proven experience: 3+ years in data engineering, preferably in an e-commerce or retail environment.
- SQL: Expertise in writing complex SQL queries for data manipulation and analysis.
- BigQuery: Extensive experience in using BigQuery for data warehousing and processing.
- Tableau: Proficiency in creating visualizations, dashboards, and reports in Tableau.
- Data Pipelines: Familiarity with ETL tools and processes (e.g., Apache Airflow, Google Cloud Dataflow); a minimal scheduling sketch follows this list.
- Programming: Familiarity with Python or other scripting languages for automation and data manipulation.
- Problem-solving: Strong analytical and problem-solving skills with the ability to troubleshoot and resolve data-related issues.
- Communication: Excellent verbal and written communication skills for collaborating with cross-functional teams.
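For reference, the sketch below shows roughly how a daily load like the one above might be scheduled with Apache Airflow, one of the ETL tools mentioned. It is an illustrative outline under assumed names (the DAG id and callable are hypothetical) and assumes Airflow 2.x, not a prescribed setup.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_orders_to_bigquery():
    # Placeholder: extract from the source system, transform, and run a
    # BigQuery load job (see the earlier sketch).
    ...


# Hypothetical DAG that runs the load once a day.
with DAG(
    dag_id="daily_orders_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_orders",
        python_callable=load_orders_to_bigquery,
    )
```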