(Senior) Data Engineer
Chennai, IN
ROLE DESCRIPTION SUMMARY
The Space Programs software development team focuses on developing software solutions that support current and next-generation MEO/GEO space and ground assets. The global team consists of system architects, software engineers, software architects, and test & integration engineers. We are looking for a Data Engineer to join our team. As part of the team, you will play a key role in the design, development, and validation of multiple data solutions for MEO and GEO space programs. You are ideally someone who thrives in a fast-paced environment, has a curious mind, and likes learning new things. Plus, you’re a true team player who values collaboration, knows that the best ideas come from working together, and loves helping others succeed just as much as yourself!
PRIMARY RESPONSIBILITIES / KEY RESULT AREAS
• Design and build robust data architectures using Azure Data Services, including Azure Data Lake, Azure Synapse Analytics, and Azure SQL
• Implement and optimize real-time data pipelines using Apache Spark, Databricks, and Kafka
• Develop, test, and deploy data solutions using Python and CI/CD pipelines
• Manage infrastructure as code with Terraform for seamless deployments
• Desirable certifications:
o Microsoft Certified: Azure Data Engineer Associate
o Microsoft Certified: Azure Solutions Architect Expert
o Databricks Certified Associate Developer for Apache Spark
• Coordinate and share information and knowledge with other development, test and operations teams, fostering effective communication
• Actively participate in knowledge-sharing sessions with team members and subject matter experts, contributing to a collaborative and learning-oriented environment
• Document input from stakeholders and operations, showcasing the developed solutions and addressing any inquiries to ensure customer satisfaction
• Maintain comprehensive technical documentation, capturing key aspects of the software development process and system architecture for future reference and knowledge transfer
• Foster and enhance a culture of collaboration and excellence in everything we do
• Nice to have: experience with machine learning frameworks and AI tools in Azure
COMPETENCIES
• Demonstrates a strong ability to work independently, aligning work with high-level objectives and long-term goals.
• Exhibits high motivation and adept prioritization skills, consistently delivering results within tight deadlines while working towards overarching objectives.
• Works autonomously and proactively takes initiative when necessary.
• Possesses outstanding communication and presentation skills, conveying complex technical information to both technical and non-technical audiences.
• Demonstrates a quick and decisive approach to problem-solving, addressing challenges promptly and effectively.
• Displays effective intercultural awareness, contributing to a collaborative and inclusive work environment.
• Proactively fosters a mindset of helping others succeed, evidenced by a track record of mentoring and supporting team members in achieving their goals.
QUALIFICATIONS & EXPERIENCE
• Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related technical field
• Certifications such as Azure Data Engineer Associate and Databricks Certified Associate Developer are highly desirable.
• 7+ years of experience in data engineering roles, with a strong focus on designing and implementing large-scale data pipelines and solutions using Azure services
• Expert in developing and deploying data pipelines for processing data from streaming and batch data sources
• Hands-on experience setting up Databricks environments in Azure, including Spark clusters, Unity Catalog, Delta Live Tables, and scalable, sustainable data engineering jobs
• Experience designing data flows, building a data strategy, and documenting the process
• Expert knowledge of Azure Data Lake, Azure Data Factory, Azure Log Analytics, Delta Lake, Kafka, Structured Streaming, the DataFrame API, SQL, and NoSQL databases
• Experience with performance tuning, use of the CLI and REST API, and cost optimization for Databricks compute and storage
• Experience working with Azure services such as Azure Key Vault, Azure Functions, and Azure VNet
• Experience designing and developing database tables and/or schemas using SQL/NoSQL
• Experience monitoring and maintaining the data pipelines and jobs
• Experience setting up data zones in Delta Lake and implementing fine-grained access control
• Experience working in Power BI to set up data flows and create DAX queries, reports, and dashboards with near-real-time updates
• Comfortable with APIs, Git, notebooks, Spark jobs, performance tuning, container-based deployments, and Terraform
• Familiarity with data streaming architectures, Azure Synapse, Azure NSG, data catalogs, and Azure Purview is a plus
• Databricks Certified Associate Developer certification is preferred
• Proficient in Python for data manipulation, automation, and scripting
• Demonstrated experience in designing end-to-end data architectures for long-term storage and real-time data streaming solutions
• Experience with data security, governance, and performance optimization on Azure
• Experience with SQL and NoSQL databases.
• Familiarity with Azure Machine Learning and AI services (preferred but not mandatory).
Embark on a career with us, where diversity isn't just a buzzword – it's our driving force. We are crafting a workplace mosaic that values every hue, background, and perspective. Join a global team where inclusivity sparks innovation, and individuality is not only embraced but celebrated. At SES we are committed to hiring inspiring individuals from all backgrounds. We take great pride in creating safe and inclusive processes, and we support the recruitment, retention, and evolution of all employees irrespective of gender, colour, race, ethnicity, religion, sexual orientation, disability, veteran or marital status, background, or walk of life.
SES is an Equal Opportunity Employer and welcomes diversity!