Visa: US citizens and Green Card holders preferred
Skills: AWS, strong SQL, performance tuning, ETL tool development, Tableau, Cognos, Power BI, data visualization, DynamoDB, Aurora Serverless
Experience level: Mid-senior
Remote job – Anywhere in US (Local to Minnesota preferred)
The Senior Data Engineer is a member of a new Data Science and Engineering team. This role requires expertise in the design, creation, management, and business use of large datasets across a variety of data platforms, both internal and external.
The Senior Data Engineer will work with other Software Engineers and Architects responsible for different data assets to understand data requirements and to build ETL pipelines that ingest the data into the (Healthcare communication Company) Go data lake. They should be an expert in designing, implementing, and operating stable, scalable, low-cost solutions that flow data from production systems into the data lake.
Above all, they should be passionate about working with very large-scale data sets and someone who loves to bring datasets together to answer business questions and drive development and growth.
Essential Duties and Responsibilities:
- Work with multiple (Healthcare communication Company) Engineering and DevOps teams to develop, optimize and maintain the Data Science and Engineering data assets.
- Design, construct and test data models, including large-scale data repositories optimized for reporting and data as a service that may influence or drive architectural changes.
- Coordinate activities with data source application owners to ensure optimum integration, data integrity, and data quality.
- Ensure production service levels, performance quality, and resolution of data load failures.
- Translate business requirements into ETL designs and mapping specifications.
- Coordinate with the (Healthcare communication Company) Go security team to ensure all data asset security requirements are met.
- Collaborate with (Healthcare communication Company) Go Engineering teams to plan new features, as needed.
- Participate in planning and scoping meetings for future projects.
- Mentor and train fellow team members on technologies, design patterns, and best practices.
- Keep abreast of industry trends and present findings to team, leadership, and stakeholders.
- Research and resolve issues in a timely manner, identifying root causes and implementing sound technical resolutions.
Worksite Location: Home Based Office
Education and/or Experience:
Bachelor’s degree required.
Required Skills and Qualifications:
- AWS ecosystem familiarity (Lambda, DynamoDB, Aurora Serverless, SNS).
- Deep business intelligence platform experience (Tableau, Cognos, Spotfire, etc.)
- Strong SQL skills, including performance tuning in a SQL environment.
- Development of ETL tools/processes to load data repositories and create data stores.
- Working with large-scale databases, collection, and organization of real-time event streaming data.
- Working with Dimensional, Entity-Relationship, Tabular models, and OLAP data modeling.
- A proven track record of delivering in an agile environment while managing multiple priorities.
- Practical experience with Continuous Integration/Continuous Deployment (CI/CD).
- Experience with Git/GitHub or a comparable distributed version control system.
- Experience working with Jenkins.
To apply for this job, send me your CV.
Please include your email address along with a cover letter, as I have to submit it to the client. Once you have applied, I will send you the screening questions; your answers will help me submit your application to the client quickly.
To apply for this job email your details to email@example.com