Careers

Data Engineer

Years of Experience: 
3 Years
Skills Stack: 
Azure, AWS, and Snowflake
Shift Timings: 
05:30 am - 05:30 pm (GMT-04:00) Eastern Time (US and Canada)
Role and Responsibility Details: 
  • Creating ETL processes, implementing all business rules in the system, writing unit test cases, and preparing and validating business requirement files.
  • Utilizing ETL and ELT technologies including Azure Data Factory, Databricks, DBT, SSIS, and Fivetran to efficiently load structured and semi-structured data from multiple systems into data marts and warehouses.
  • Designing multidimensional data warehouses (OLAP) using Oracle Hyperion Solutions and SQL Server Analysis Services to facilitate comprehensive data analysis.
  • Implementing end-to-end architecture using Microsoft Azure solutions: Azure Data Lake, Azure Databricks, Data Factory, and Synapse Data Warehouse, to deliver better organizational data solutions for reporting.
  • Developing solutions in Business Intelligence and visualization tools including Tableau, Power BI, SAP Business Objects, SSRS, Cognos, SAP Lumira, and MicroStrategy.
  • Migrating legacy and relational data systems to SaaS-based warehouses, including Snowflake and Synapse, for serverless processing and data analytics capabilities.
  • Utilizing Snowflake features including data sharing, SnowSQL, Snowpipe, Tasks, virtual warehouse sizing, zero-copy cloning, Time Travel, stored procedures, and functions to build robust data marts and warehouses.
  • Leading and managing onshore and offshore teams to build end-to-end solutions and automating test patterns and models to accelerate development processes.
  • Creating high-level business requirement documents and functional requirement documents, and maintaining them using Jira, TFS, and GitHub for code and document version management. Working in an Agile Scrum environment to accelerate development and plan future sprints.
  • Analyzing, clarifying, and resolving defects raised by customers and logged as tickets in the Incident Tracking System (ITS).
  • Creating stored procedures and views to serve report requirements for Tableau dashboards, along with jobs and alerts for ETL processes and report extracts.
Job Description: 
  • Design and develop data warehouse solutions using Azure, AWS, and Snowflake technologies tailored for financial and budget planning goals.
  • Utilize ETL and ELT technologies including Azure Data Factory, Databricks, DBT, SSIS, and Fivetran to efficiently load structured and semi-structured data from multiple systems into data marts and warehouses.
  • Design multidimensional data warehouses (OLAP) using Oracle Hyperion Solutions and SQL Server Analysis Services to facilitate comprehensive data analysis.
  • Implement end-to-end architecture using Microsoft Azure solutions: Azure Data Lake, Azure Databricks, Data Factory, and Synapse Data Warehouse, to deliver better organizational data solutions for reporting.
  • Develop solutions in BI and visualization tools including Tableau, Power BI, SAP Business Objects, SSRS, Cognos, SAP Lumira, and MicroStrategy.
  • Migrate legacy and relational data systems to SaaS-based warehouses, including Snowflake and Synapse, for serverless processing and data analytics capabilities.
  • Utilize Snowflake features including data sharing, SnowSQL, Snowpipe, Tasks, virtual warehouse sizing, zero-copy cloning, Time Travel, stored procedures, and functions to build robust data marts and warehouses.
  • Develop custom solutions using C#.NET, Python, SQL, JavaScript, Scala, and DAX to address specific business intelligence requirements.
  • Lead and manage onshore and offshore teams to build end-to-end solutions and automate test patterns and models to accelerate development processes.
  • Create high-level business requirement documents and functional requirement documents, and maintain them using Jira, TFS, and GitHub for code and document version management.
  • Lead discussions to extract and articulate business requirements and corresponding solutions to drive business development initiatives.
  • Work in an Agile Scrum environment to accelerate development and plan future sprints.
  • Use big data technologies including Hadoop, Storm, Kafka, NoSQL databases, and graph databases, staying current with emerging trends and technologies.
Qualifications: 
Bachelor of Engineering
Locations: 
New Jersey - US
Contact: 
Recruitment Team
Email: tdg-recruitment@thedigitalgroup.com