Are you passionate about crafting efficient data processing solutions and thrive in a collaborative, dynamic environment? We are searching for a talented Data Engineer to contribute to our cross-functional teams, transforming data requirements into robust, scalable solutions.
• Data Pipeline Maestro: Design, develop, and maintain robust data pipelines, seamlessly handling the ingestion, processing, and transformation of large volumes of structured and unstructured data.
• Integration Expert: Implement data integration processes across various systems and data sources, ensuring data consistency and accuracy throughout the ecosystem.
• Performance Guru: Optimize data pipelines for stellar performance, scalability, and reliability, always adhering to industry best practices.
• Troubleshooting Pro: Tackle challenges head-on by troubleshooting and resolving issues related to data pipeline performance, data quality, and system failures.
• Tech Explorer: Stay ahead of the curve by exploring and evaluating new big data technologies and tools, enhancing our overall data infrastructure.
• Documentation Aficionado: Develop and maintain comprehensive documentation for data pipelines, processes, and solutions, ensuring knowledge accessibility and transfer.
• Collaborative Spirit: Work closely with data scientists, analysts, and other stakeholders to fulfill diverse data-related requirements.
• Educational Background: Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
• Proven Expertise: Bring proven experience as a Data Engineer in designing, building, and maintaining data pipelines.
• Coding Fluency: Proficiency in programming languages such as Java, Scala, and Python is essential to excel in this role.
• Big Data Know-How: Possess a strong understanding of big data technologies and concepts, including Hadoop, Spark, Kafka, and related ecosystems.
• ETL Mastery: Showcase your experience with data modeling, ETL processes, and data warehousing concepts.
• Cloud Savvy: Exhibit familiarity with cloud platforms such as AWS, GCP, or Azure, and their respective data services.
• Tooling Bonus: Experience with data tools such as Altair Monarch, RapidMiner, and Altair SLC is a plus.
If you’re a driven professional seeking a new challenge and eager to make a significant impact in the data engineering domain, apply today and become a vital part of our innovative team.
Don’t miss the opportunity to shape the future. Apply now.