
Senior Data Engineer (Developer)

Creighton University
United States, Nebraska, Omaha
2500 California Plaza (Show on map)
Oct 31, 2025

The Senior Data Engineer is an expert in developing and building data management processes from a variety of data sources. This requires advanced skill in data modeling, data mapping, data pipeline construction, and data integration across disparate sources and targets, using the university's enterprise framework of data lakes and stores. The Senior Data Engineer also participates in requirements gathering, documentation, collecting data definitions, and source-to-target cataloging, as well as defining and implementing data transformation, quality checking, cleansing, and standardization processes. Additionally, this role is responsible for testing the resulting processes for accuracy and performance.

Although this is an individual contributor role, it sits in a highly collaborative agile team environment requiring personal flexibility to perform a variety of roles such as analyst, designer, developer, and tester, all to ensure our partners' needs are met as fully, accurately, and promptly as possible.

Essential Functions:



  • Create and maintain data pipelines, jobs, and related processes for extracting, transforming, and loading data from diverse sources (e.g., databases, APIs, flat files).
  • Extract, Transform, Load (ETL): ETL/iPaaS processes are the backbone of data pipelines. Extract data from various sources (e.g., APIs, databases), transform it (cleaning, aggregating, enriching), and load it into storage systems (data warehouses, databases). Ensure data flows smoothly, adhering to best practices and industry standards.
  • Monitoring: Evaluate code and performance to identify opportunities for improvement.
  • Automation: Writing code (Python, SQL, or other languages) to automate data management tasks ensures consistent and reliable data flow. Schedule jobs, handle error handling, and monitor pipeline performance.
  • Data Quality: Implementing data quality checks ensures that the data is accurate, complete, and consistent. Define rules (e.g., missing values, outliers) and monitor adherence.
  • Deliver support and troubleshooting for operational data integration issues; debug and provide solutions for performance problems related to data flows.
  • Testing: Conduct unit, integration, and end-to-end tests for data quality. Prepare test data, set up test environments, and validate all components of ETL pipelines and data flows to maintain system integrity.
  • Documentation: Create and maintain comprehensive documentation for data processes, pipelines, architecture, and metadata, within the prevailing standard, so that team members and stakeholders can understand the structure and logic of data systems and so that it supports scalability and troubleshooting.
  • Mapping: Conduct mapping of data sources to target systems, describing how data fields correspond between systems to support migration, integration, and transformation efforts.
  • Business Requirements: Work closely with stakeholders to translate business needs into technical requirements, including understanding domain-specific needs, designing corresponding data solutions, and ensuring pipelines and storage meet objectives.
  • Modeling: Collaborate with data architects and stakeholders to design and build robust data schemas and structures aligned with business requirements.
  • As business needs require, serve as a resource for team members, answering questions and providing solutions for data integration outside their primary responsibilities.
  • Demonstrate active membership in internal/external organizations like the Association of Jesuit Colleges and Universities (AJCU)
  • Provide effective communication on matters relating to the job description
  • Be an active contributor to onboarding and ongoing training of team members
  • Remain current in the field of expertise to ensure assigned applications remain at the industry forefront.
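The extract-transform-load and data quality duties above can be sketched as a minimal Python pipeline. This is an illustrative assumption, not Creighton's actual tooling: `sqlite3` stands in for the target store, the inline CSV stands in for a flat-file source, and the quality rules (drop missing GPAs, reject out-of-range values) are hypothetical examples of the checks described.

```python
import csv
import io
import sqlite3

# Hypothetical inline CSV standing in for a flat-file source;
# a real pipeline would read from databases, APIs, or files on disk.
RAW_CSV = """id,name,gpa
1,Alice,3.9
2,Bob,
3,Carol,4.2
"""

def extract(text):
    """Extract: parse rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: clean and standardize rows, applying quality rules."""
    cleaned = []
    for row in rows:
        # Quality rule: drop records with a missing GPA.
        if not row["gpa"]:
            continue
        gpa = float(row["gpa"])
        # Quality rule: reject out-of-range values as outliers.
        if not 0.0 <= gpa <= 4.0:
            continue
        cleaned.append((int(row["id"]), row["name"].strip(), gpa))
    return cleaned

def load(rows, conn):
    """Load: write cleaned rows into the target store."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS students (id INTEGER, name TEXT, gpa REAL)"
    )
    conn.executemany("INSERT INTO students VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*) FROM students").fetchone()[0])  # 1 row survives both checks
```

Each stage here is a separate, independently testable function, which mirrors the posting's expectation that unit, integration, and end-to-end tests cover every component of the pipeline.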


Qualifications:



  • Bachelor's degree (required) in Computer Science, Data Science, or a related field.
  • 5+ years of combined, demonstrated experience in enterprise data engineering activities.


Knowledge, Skills, and Abilities:



  • Strong experience with ETL/iPaaS tools such as MuleSoft, Talend, Informatica, Boomi, and Workato.
  • Familiarity with DBMSs, ideally Oracle and MS SQL Server.
  • Proficiency in SQL: essential for querying databases and manipulating data.
  • Experience with Python, SSIS, Talend, or other scripting tools for ETL and data analysis.
  • Familiarity with cloud platforms (e.g., AWS, Azure, GCP) for scalable data solutions.
  • Demonstrated ability to develop ETL/ELT jobs, database schemas, tables, indexes, views, queries, and understand the implementation tradeoffs of various methodologies.
  • Strong analytical and problem-solving skills: optimizing queries, handling large datasets, or troubleshooting pipeline issues.
  • Familiarity with continuous integration and continuous deployment (CI/CD).
  • Demonstrated proficiency at utilizing APIs and flat file loads to move data from source to target
  • Demonstrated success in data quality assurance, testing, and data tuning.
  • Demonstrated ability to work independently as well as a collaborative team member with other departments.
  • Ability to thrive in a fast-paced, goal-oriented environment.
  • History of creative problem-solving while adhering to established standards
  • Coachable and open to feedback
  • Outstanding interpersonal and communication skills, along with the ability to work effectively within a diverse community.
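The API-and-flat-file movement of data named in the list above can be sketched in a few lines of standard-library Python. Everything here is a hedged illustration: the JSON payload stands in for an HTTP API response, the inline CSV for a file on disk, and the field names and join key (`id`) are hypothetical.

```python
import csv
import io
import json

# Hypothetical payloads: in practice the JSON would come from an HTTP API
# call and the CSV from a flat file; names and field shapes are illustrative.
API_RESPONSE = json.dumps([{"id": 1, "email": "a@example.edu"},
                           {"id": 2, "email": "b@example.edu"}])
FLAT_FILE = "id,name\n1,Alice\n2,Bob\n"

def from_api(payload):
    """Parse an API response into records keyed by id."""
    return {rec["id"]: rec for rec in json.loads(payload)}

def from_flat_file(text):
    """Parse a flat-file (CSV) extract into records keyed by id."""
    return {int(r["id"]): {"id": int(r["id"]), "name": r["name"]}
            for r in csv.DictReader(io.StringIO(text))}

def merge(api_recs, file_recs):
    """Join the two sources on id before loading into the target."""
    return [{**file_recs[i], "email": api_recs[i]["email"]}
            for i in sorted(api_recs.keys() & file_recs.keys())]

rows = merge(from_api(API_RESPONSE), from_flat_file(FLAT_FILE))
print(rows[0])  # {'id': 1, 'name': 'Alice', 'email': 'a@example.edu'}
```

Keying both sources by a shared identifier before merging is the same source-to-target mapping discipline the Essential Functions section describes.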
