About the Company
A leading IT solutions organization serving customers in North America, Europe, Asia, and Australia. It offers services in Application Development and Maintenance, Enterprise Solutions (including Managed Services), and Business Process Outsourcing to organisations in the Financial Services, Travel & Transportation, Manufacturing/Distribution, and Government sectors. It employs over 10,000 professionals and follows industry-standard software development processes.
More information on request.
Company – NIIT Technologies Ltd.
Work Location - Bangalore
Role: SQL Developer / Data Warehouse ETL Developer – JD
Experience: 6+ Years
Data Warehouse / ETL Lead Developer (Pentaho/SQL Server/SSIS)
A highly experienced Data Warehouse Developer with proficiency in ETL design and development using Pentaho Data Integration and the SQL Server technology stack. The candidate must have knowledge of Data Warehouse concepts, dimensional modelling, and ETL architecture, and should be disciplined, detail-oriented, self-motivated, and delivery-focused.
· In-depth knowledge and hands-on experience with data extraction, integration, transformation, and migration.
· Experience in data profiling, data conformance, data quality, data governance, and data lineage.
· Expert in Pentaho Data Integration version 7.6 / 8.1 and SQL Server 2012/2014/2016
· Designs and develops ETL processes for a SQL Server-based Data Warehouse.
· Works with the BI Architect and Technical Lead to understand the solution vision and create the data integration design.
· Ensures source system data availability and accessibility, data integrity, restartability, and error handling.
· Follows industry standards and best practices.
· Strong knowledge of the Pentaho enterprise edition and metadata injection methods.
· Plans and conducts ETL unit and development tests; monitors results and takes corrective action.
· Defines and implements data quality logic associated with data processing flows.
· Designs, develops, automates, and supports complex applications to extract, transform, and load data from multiple sources into accessible and performant structures that support analytic needs
· Performance tuning of ETL processes
· Providing production support
· Strong communication skills.
· Must work as part of a team with a focus on delivery.
· Must have experience with the onsite/offshore development model.
Skills Desired but not mandatory:
· Big Data technologies – Hive, MapReduce
· Python 2.7 / 3.7
· JavaScript
Current CTC :
Expected CTC :