Data Engineer – Data Warehouse (Hybrid – US)

DS Technologies Inc

Full Time • Hybrid – US
About Us: We provide innovative, transformative IT services and solutions. We are passionate about helping our clients achieve their goals and exceed their expectations, and we strive to deliver the best possible experience for our clients and employees. Committed to continuous improvement and innovation, we work collaboratively with our clients and employees to achieve success.
 
DS Technologies Inc is seeking a Data Engineer – Data Warehouse for one of our premier clients.

Job Title: Data Engineer – Data Warehouse
Location: New York, NY (Hybrid – 3 days onsite at the American Express Tower)
Duration: Long Term Contract
Employment Type: W2 only

Work Arrangement:
  • Hybrid: 3 days onsite at the American Express Tower, New York, NY.
Job Description:

We are seeking an experienced Data Engineer with strong expertise in ETL development, data warehousing, and large-scale data processing. The ideal candidate will have a solid understanding of data modeling, SQL, and Python-based ETL pipeline development. This role involves building and maintaining data pipelines that enable analytics, governance, and reporting across various enterprise systems.

Key Responsibilities:
  • Design, develop, and deploy Python-based ETL pipelines from scratch.
  • Write efficient, reusable, and modular code adhering to OOP principles.
  • Design and optimize data models including schemas, entity relationships, and transformations.
  • Develop and tune SQL queries across multiple RDBMS platforms (SQL Server, DB2, Oracle).
  • Work with data warehouses such as BigQuery and Databricks Delta Lakehouse for ingestion, cleansing, governance, and reporting.
  • Create, manage, and monitor Airflow DAGs for scheduling and workflow automation.
  • Collaborate with cross-functional teams to ensure data consistency, accuracy, and accessibility.
  • Troubleshoot data issues and optimize ETL performance.
Required Skills & Experience:
  • 6+ years of experience in Data Engineering, ETL Development, and Data Warehousing.
  • Strong programming skills in Python with proven experience in ETL pipeline design.
  • Expertise in SQL and relational databases (MS SQL Server, DB2, Oracle).
  • Hands-on experience with Airflow for orchestrating workflows.
  • Solid understanding of OOP concepts and data modeling principles.
  • Experience with BigQuery, Databricks Delta Lakehouse, or similar data warehouse technologies.
  • Familiarity with Spark is a strong plus.
Nice to Have:
  • Experience with IBM Apptio platform and cost management data workflows.
  • Knowledge of Node.js for integration scripting or data API handling.

Flexible work from home options available.

Compensation: $51.00 per hour

We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.




