Overview
This is a remote role that may only be hired in the following locations: NC, TX, AZ, FL, GA, VA.

This position is responsible for the end-to-end delivery and maintenance of cloud and on-premises data solutions, including data integration (ETL/ELT), data quality, data lineage, metadata management, and data governance tools. Ensures deliverables meet the expected functionality, maintains the integrity of data for business operations and decision-making, and provides support for the services offered. Works with a team of developers with deep experience in on-premises and cloud-based data warehousing services.
Responsibilities
- Builds, manages, and implements metadata-driven data pipeline capabilities, including data modeling, process design, overall data pipeline architecture, and all phases of the ETL (extract, transform, and load) process. Automates repeatable data preparation and integration tasks.
- Utilizes expertise to develop technical solutions to unique system problems and ensures quality results. Serves as a technical resource for management, associates, and business units.
- Partners with technology teams to understand data capture and testing needs and to build and test end-to-end solutions. Leads small data engineering projects with manageable risks and resource requirements; plays significant roles in larger, more complex initiatives.
- Provides technical support to production systems by addressing complex issues, anticipating maintenance requirements, and ensuring functionality for end user needs.
- Collects data related to user requests; determines scope and time estimates; and implements effective technical solutions by providing problem analysis and resolution in a timely manner.
- Inspects business specifications, programming specifications, coding, test plans, documentation, and implementation plans for accuracy.
- Collaborates with architects and development teams to understand data requirements, design new tables and queries, and ensure the design is feasible, implemented accurately, and compliant with all bank standards.
- Codes complex solutions to integrate, clean, transform, and control data; builds processes supporting data transformation, data structures, metadata, data quality controls, dependency, and workload management; assembles complex data sets; and communicates the information required for deployment.
- Supports efforts by the data engineering team to close gaps in data management standards adherence, identifies and communicates solutions to complex problems, and leverages knowledge of information systems, techniques, and processes.
- Monitors key performance indicators and internal controls. Must be comfortable working in a distributed, multi-time-zone, and constantly changing environment.
- Performs unit tests and conducts reviews with other team members to ensure code is rigorously designed, elegantly coded, and effectively tuned for performance.
- Performs performance tuning of complex SQL queries and data pipelines.
- Creates data ingestion pipelines in the data warehouse and other large-scale data platforms for a variety of sources: files (flat, delimited, Excel), databases, APIs (with Apigee integration), and SharePoint.
- Develops test plans; contributes to existing test suites, including integration, regression, and performance; analyzes test reports; identifies test issues and errors; and leads triage of underlying causes.
- Creates scheduled and trigger-based ingestion patterns using the Redwood scheduler.
- Builds and manages CI/CD pipelines using GitLab Runner.
- Collaborates with technical and delivery teams to deploy and validate team deliverables; participates in PI planning, story refinement, Jira creation and management, release planning and execution, CR creation, and other release readiness activities.
- Works in an Agile framework, participating in agile ceremonies and coordinating with the scrum master, tech lead, and product owner (PO) on sprint planning, backlog creation and refinement, demos, and retrospectives.
- Provides technical support for incident tickets escalated to the team.
- Participates in 24/7 on-call rotations.
Qualifications
Bachelor's Degree and 4 years of experience in software application development and maintenance, OR High School Diploma or GED and 8 years of experience in software application development and maintenance.

Preferred:
- 5 years of hands-on experience in data engineering/data integration using SQL, Informatica, DataStage, Netezza, and SSIS technologies
- 3 years of experience in agile engineering practices and leading multi-time-zone data teams
- 1 year of hands-on experience in cloud computing (AWS, Snowflake, dbt, etc.)
- Working knowledge of design and development concepts in cloud data warehousing, transactional and analytical data modeling, data quality, business reporting, and dashboards
- Working knowledge of cloud technologies, analytics platforms, data mastering, integrations, and big data management
- Working knowledge of analytics delivery, Power BI, SSAS, and SSRS
Benefits are an integral part of total rewards and First Citizens Bank is committed to providing a competitive, thoughtfully designed and quality benefits program to meet the needs of our associates. More information can be found at https://jobs.firstcitizens.com/benefits.