About the Opportunity
- We are only considering US Citizens and Green Card holders for this position. We are unable to provide sponsorship for this role.
- We are only considering local candidates who currently reside within 45 minutes of postal code 08611
- No Third Party Agencies
- $104 per hour (1099) or $90 per hour (W-2) plus benefits
- One day per week on-site in Trenton, New Jersey; work from home the remaining days
- Job Id: CAI-680812
- Must be able to provide proof of COVID-19 vaccination plus booster shot
- Contract Term: Through the end of the 2022 fiscal year, with yearly extensions based on performance
Short Description
The State of New Jersey’s Administrative Office of the Courts (NJAOC) is seeking a Senior ETL Data Lead to develop data integration programs that load, transform, and extract data to and from the data warehouse; ensure that architecture and development follow best practices; and design, develop, and test ETL processes. The ideal candidate will have 5+ years of experience coding dimension and fact tables and an understanding of dimensional modeling and star schemas.
Required Skills/Years of Experience
- Bachelor’s Degree in a relevant technical field
- 5+ years of coding dimension and fact tables with an understanding of dimensional modeling and star schemas
- 5+ years of experience as a senior developer with cloud architecture, AWS, or IBM DataStage
- 5+ years of experience with data warehousing concepts for data quality and governance implementation
- 5+ years of designing and developing DataStage ETLs
- 5+ years of experience with AWS CLI, JSON, and/or PowerShell scripting
- 4+ years of experience with AWS cloud environment best practices
- 4+ years of experience with IBM Tools such as Information Analyzer or Watson Knowledge Catalog
- 4+ years of developing scripts in scripting languages for cloud services
- 4+ years of experience with the AWS CLI, S3, Redshift, EC2, API Gateway, Lambda, and Step Functions with Python
- 4+ years of designing and architecting solutions using AWS and IBM foundation services (VPC/VPN)
- 3+ years of migrating on-premises workloads to cloud service providers and managing them
Desired Skills/Years of Experience:
- 4+ years with AWS and an AWS Solutions Architect Certification
- 3+ years of IBM Information Server administration experience
Complete Description
The State of New Jersey’s Administrative Office of the Courts (NJAOC) is seeking a Senior ETL Data Lead to develop data integration programs that load, transform, and extract data to and from the data warehouse; ensure that architecture and development follow best practices; and design, develop, and test ETL processes. The ideal candidate will have 5+ years of experience coding dimension and fact tables and an understanding of dimensional modeling and star schemas.
This position is responsible for developing data integration programs that load, transform, and extract data to and from the data warehouse, assuring that architecture and development follow data warehouse best practices. The candidate will design, develop, and test ETL processes for enterprise-wide data warehouse and data store implementations using DataStage, working alongside the Cloud Architect. The candidate will assume overall responsibility for data migration solutions and must be able to work with AWS services such as S3, EC2, VPC, Glue, and Lambda; with Python and JSON; and with databases such as AWS Aurora, DynamoDB, RDS database engines, and Redshift.
The Senior ETL Data Lead will be tasked to:
- Work closely with leadership and Data Governance to define and refine the Data Lake platform
- Extract, transform, and load data from various databases
- Develop and automate advanced ETL processes to gather or provide data
- Document, publish, and maintain ETL processes and related documentation
- Apply best practices when developing AWS Data Lake flows
- Create ETL flows to integrate on-premises data into AWS S3 buckets
- Translate detailed business requirements to optimal database and ETL solutions
- Create ETL jobs that cleanse data and load it into AWS S3 buckets
- Create a flow to load data from Amazon S3 to Redshift using Glue (a minimal sketch follows this list)
- Diagnose complex problems including performance issues
- Write reusable, testable, and efficient code in Python
- Create Collibra Data Governance Center (DGC) models including assets, relationships, domains, and communities
- Configure Data Governance based on business requirements and specifications
- Assist Cloud Architect with overall design of data migration solution
- Connect to and extract (meta)data from multiple source systems, including databases such as Oracle, SQL Server, and Hadoop, into a data governance platform
- Develop DataStage Parallel Extender jobs using stages such as Aggregator, Join, Merge, Lookup, Source Dataset, External Filter, Row Generator, Column Generator, Change Capture, Copy, Funnel, Sort, and Peek
- Advise senior staff on technical issues
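For context on the responsibilities above, here is a minimal sketch of the kind of S3-to-Redshift flow this role involves, written as an AWS Glue PySpark job. All names are hypothetical placeholders (the example-data-lake bucket, the example-redshift-connection Glue connection, the staging.fact_case table, and the column mappings are illustrative, not NJAOC specifics).

    # Minimal sketch (hypothetical names): an AWS Glue PySpark job that reads
    # cleansed Parquet data from S3 and loads it into a Redshift staging table.
    import sys
    from awsglue.transforms import ApplyMapping
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read data previously cleansed and landed in S3 (bucket path is a placeholder).
    source = glue_context.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-data-lake/cleansed/cases/"]},
        format="parquet",
    )

    # Map source columns onto the warehouse fact-table layout (columns are illustrative).
    mapped = ApplyMapping.apply(
        frame=source,
        mappings=[
            ("case_id", "string", "case_id", "string"),
            ("filed_date", "string", "filed_date", "timestamp"),
            ("county", "string", "county", "string"),
        ],
    )

    # Write to Redshift through a Glue catalog connection (names are placeholders).
    glue_context.write_dynamic_frame.from_jdbc_conf(
        frame=mapped,
        catalog_connection="example-redshift-connection",
        connection_options={"dbtable": "staging.fact_case", "database": "warehouse"},
        redshift_tmp_dir="s3://example-data-lake/tmp/redshift/",
    )

    job.commit()

In practice, a job like this would typically be parameterized per source system and orchestrated with Glue triggers or Step Functions.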
Hiring Expectations
- We are only considering US Citizens and Green Card holders for this position. We are unable to provide sponsorship for this role.
- No Third Parties
- Right to Represent authorization is required
- Expect a technical screening interview
- Expect a face-to-face (F2F) interview
- Background check and/or credit check will be required
About Dantech
Dantech Corporation, Inc. is a Certified Business Enterprise (CBE) in the District of Columbia and a federally recognized Woman Owned Small Business (WOSB). The company has a history of technology, innovation and transformation since its launch in 1999. As an Equal Opportunity/Affirmative Action Employer, all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, protected veteran status, or disability status. For more information about positions with Dantech, please see: https://www.dantechcorp.com/staffing.