Location: Europe/USA

Position Overview: The Data Engineer is responsible for developing, constructing, testing, and maintaining architectures such as databases and large-scale data processing systems. This role involves gathering and processing raw data, designing and building data pipelines, and ensuring data is accessible, reliable, and secure for business analytics and decision-making. The Data Engineer will collaborate with data scientists, analysts, and other stakeholders to implement robust and scalable data solutions.

Key Responsibilities:

  1. Data Pipeline Development:
    • Design, build, and maintain scalable data pipelines to support data integration, transformation, and consumption.
    • Develop ETL (Extract, Transform, Load) processes to gather data from various sources, transform it, and load it into data warehouses or other storage systems.
    • Optimize data pipelines for performance, reliability, and scalability.
  2. Data Modeling and Architecture:
    • Design and implement data models that support business requirements and data analytics needs.
    • Create and maintain database schemas, tables, and indexes to ensure efficient data storage and retrieval.
    • Develop and maintain documentation related to data architecture, models, and pipelines.
  3. Data Integration:
    • Integrate data from various sources, including databases, APIs, and third-party systems, ensuring data consistency and quality.
    • Work with data scientists and analysts to provide clean, structured data for analysis and machine learning models.
    • Implement data quality checks and validation procedures to ensure data integrity.
  4. Collaboration and Communication:
    • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions.
    • Communicate technical concepts and solutions to non-technical stakeholders in a clear and understandable manner.
    • Provide support and troubleshooting for data-related issues.
  5. Performance Optimization and Monitoring:
    • Monitor and optimize the performance of data pipelines and databases to ensure efficient data processing.
    • Implement logging, monitoring, and alerting systems to proactively identify and resolve data pipeline issues.
    • Conduct performance tuning and query optimization to improve data retrieval times.
  6. Data Security and Compliance:
    • Implement data security best practices to protect sensitive data and ensure compliance with data privacy regulations.
    • Manage access controls, encryption, and data masking to safeguard data.
    • Stay current with industry standards and emerging technologies related to data security and compliance.

Qualifications:

  • Education:
    • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field. An advanced degree is preferred.
  • Experience:
    • 4-8 years of experience in data engineering, data warehousing, or a related field.
    • Proven experience with data modeling, database design, and data pipeline development.
    • Experience working with large-scale data processing systems and big data technologies.
  • Technical Skills:
    • Proficiency in programming languages such as Python, Java, or Scala.
    • Strong knowledge of SQL and experience with relational databases (e.g., MySQL, PostgreSQL, SQL Server).
    • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud data platforms (e.g., AWS, Azure, Google Cloud).
    • Experience with ETL tools and frameworks (e.g., Apache NiFi, Informatica, Databricks, Talend, Airflow).
    • Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).
  • Certifications:
    • Relevant certifications such as AWS Certified Big Data – Specialty, Google Cloud Professional Data Engineer, or similar are a plus.

Key Competencies:

  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration abilities.
  • Ability to manage multiple tasks and projects simultaneously.
  • Attention to detail and a commitment to data quality.
  • Adaptability and a continuous learning mindset.

Apply for this position
