Company overview:

TechTorch is a leader in delivering innovative Enterprise Technology solutions, leveraging AI-powered accelerators to drive business success for Private Equity-backed companies. Our team of experts works to disrupt the system integration space with cutting-edge solutions. We are currently seeking a Data Engineer to build and maintain the data architectures and pipelines that power our accelerators.

Position Overview:

The Data Engineer is responsible for developing, constructing, testing, and maintaining architectures such as databases and large-scale data processing systems. This role involves gathering and processing raw data, designing and building data pipelines, and ensuring data is accessible, reliable, and secure for business analytics and decision-making. The Data Engineer will collaborate with data scientists, analysts, and other stakeholders to implement robust and scalable data solutions.

Key responsibilities include:

Data Pipeline Development:

  • Design, build, and maintain scalable data pipelines to support data integration, transformation, and consumption.
  • Develop ETL (Extract, Transform, Load) processes to gather data from various sources, transform it, and load it into data warehouses or other storage systems.
  • Optimize data pipelines for performance, reliability, and scalability.

Data Modeling and Documentation:

  • Design and implement data models that support business requirements and data analytics needs.
  • Create and maintain database schemas, tables, and indexes to ensure efficient data storage and retrieval.
  • Develop and maintain documentation related to data architecture, models, and pipelines.

Data Integration:

  • Integrate data from various sources, including databases, APIs, and third-party systems, ensuring data consistency and quality.
  • Work with data scientists and analysts to provide clean, structured data for analysis and machine learning models.
  • Implement data quality checks and validation procedures to ensure data integrity.

Collaboration and Communication:

  • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions.
  • Communicate technical concepts and solutions to non-technical stakeholders in a clear and understandable manner.
  • Provide support and troubleshooting for data-related issues.

Performance Optimization and Monitoring:

  • Monitor and optimize the performance of data pipelines and databases to ensure efficient data processing.
  • Implement logging, monitoring, and alerting systems to proactively identify and resolve data pipeline issues.
  • Conduct performance tuning and query optimization to improve data retrieval times.

Data Security and Compliance:

  • Implement data security best practices to protect sensitive data and ensure compliance with data privacy regulations.
  • Manage access controls, encryption, and data masking to safeguard data.
  • Stay current with industry standards and emerging technologies related to data security and compliance.

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field. An advanced degree is preferred.
  • 4-8 years of experience in data engineering, data warehousing, or a related field.
  • Proven experience with data modeling, database design, and data pipeline development.
  • Experience working with large-scale data processing systems and big data technologies.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Strong knowledge of SQL and experience with relational databases (e.g., MySQL, PostgreSQL, SQL Server).
  • Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud data platforms (e.g., AWS, Azure, Google Cloud).
  • Experience with ETL tools and frameworks (e.g., Apache NiFi, Informatica, Databricks, Talend, Airflow).
  • Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).
  • Relevant certifications such as AWS Certified Big Data – Specialty, Google Cloud Professional Data Engineer, or similar are a plus.

Key competencies:

  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration abilities.
  • Ability to manage multiple tasks and projects simultaneously.
  • Attention to detail and a commitment to data quality.
  • Adaptability and a continuous learning mindset.

Why join TechTorch:

TechTorch offers a dynamic and innovative environment where you can play a key role in shaping the future of enterprise solutions. You’ll work alongside talented teams, contributing to high-impact projects that disrupt the market. Join us if you’re passionate about data engineering, technology, and driving business success through innovative solutions.