- Developed ETL programs in Informatica to implement business requirements.
- Performed source-to-target data mapping.
- Loaded aggregated data into a relational database for reporting, dashboarding, and ad-hoc analysis.
- Oversaw development of inbound and outbound interfaces, working closely with functional analysts, developers, and testers.
- Helped build the ETL architecture and source-to-target mappings used to load the data warehouse.
- Extracted data from flat files and other RDBMS sources into the staging area and loaded it into the data warehouse.
- Built Type 1 and Type 2 SCD mappings to maintain slowly changing dimension tables (see the Type 2 SQL sketch after this list).
- Worked with the Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Worked with data scientists to create Hive tables for running data models (see the Hive DDL sketch after this list).
- Converted EDW workloads to the big data platform.
- Implemented Pass Through, Hash Auto-Keys, Hash User Keys, and Database partitioning for performance tuning.
- Analyzed the bulk load option and third-party loaders recommended by Informatica.
- Integrated Hadoop into the traditional ETL flow, accelerating extraction, transformation, and loading of massive volumes of structured and unstructured data.
- Extracted data from legacy systems into the staging area using ETL jobs and SQL queries.
- Developed SQL scripts and procedures for business rules using Unix shell and NZSQL on Netezza (see the NZSQL sketch after this list).
- Assessed the Netezza environment for implementing ELT solutions.
- Performed structural harmonization by integrating multiple staging tables into a common alignment-schema table (see the harmonization sketch after this list).
- Performed requirements analysis, then designed and built a data services layer in Denodo Express feeding downstream dashboard systems.
- Created custom Denodo views by joining tables from multiple data sources.
- Designed and developed integration solutions with the Denodo data virtualization tool, reading data from multiple sources including Oracle, Hadoop, and MySQL (see the VQL sketch after this list).
- Designed and deployed rich Tableau visualizations with drill-down, drop-down menu options, and parameters.
- Strong dashboard design experience and a practiced eye for effective data visualization; familiar with visualization and design best practices.
- Designed, developed, tested, and maintained functional Tableau reports based on user requirements.
- Executed projects using the Agile methodology and coordinated teams across multiple locations.
- Created Jira tickets and checked code into SVN branches using TortoiseSVN.
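
The following is a minimal SQL sketch of the Type 2 SCD pattern referenced above; the dim_customer/stg_customer tables and the tracked address column are hypothetical stand-ins for the actual dimensions.

```sql
-- Step 1: expire the current row when a tracked attribute changes
-- (table and column names are illustrative, not the actual warehouse objects).
UPDATE dim_customer d
SET    eff_end_date = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND EXISTS (
      SELECT 1
      FROM   stg_customer s
      WHERE  s.customer_id = d.customer_id
      AND    s.address    <> d.address);

-- Step 2: insert a new current version for changed or brand-new customers.
-- Changed customers no longer have a current row after Step 1, so the
-- LEFT JOIN finds nothing and they fall into the IS NULL branch.
INSERT INTO dim_customer
      (customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;
```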
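A minimal HiveQL sketch of the kind of table created for the data-model work; the analytics.customer_features name, columns, and HDFS location are illustrative assumptions.

```sql
-- Hypothetical external Hive table exposing aggregated warehouse data
-- to model training, partitioned by load date for incremental refreshes.
CREATE EXTERNAL TABLE IF NOT EXISTS analytics.customer_features (
    customer_id   BIGINT,
    total_spend   DOUBLE,
    order_count   INT,
    last_order_dt STRING
)
PARTITIONED BY (load_date STRING)
STORED AS PARQUET
LOCATION '/warehouse/analytics/customer_features';
```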
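A minimal NZSQL sketch of a Netezza aggregate build; the stage.sales and edw.daily_sales_agg tables are hypothetical. DISTRIBUTE ON chooses the hash distribution key so related rows stay co-located across the appliance.

```sql
-- Illustrative Netezza CTAS: build a daily sales aggregate and
-- distribute on store_id so store-level joins avoid redistribution.
CREATE TABLE edw.daily_sales_agg AS
SELECT store_id,
       sale_date,
       SUM(amount) AS total_amount,
       COUNT(*)    AS txn_count
FROM   stage.sales
GROUP  BY store_id, sale_date
DISTRIBUTE ON (store_id);
```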
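A minimal SQL sketch of the structural-harmonization step, assuming two hypothetical staging feeds (CRM and ERP) conformed into one alignment-schema table.

```sql
-- Illustrative harmonization: map differently shaped staging feeds
-- onto a shared alignment-schema column layout.
INSERT INTO align.customer_aligned
      (source_system, customer_id, full_name, country_code)
SELECT 'CRM', c.cust_id, c.first_nm || ' ' || c.last_nm, c.ctry
FROM   stage.crm_customer c
UNION ALL
SELECT 'ERP', e.customer_no, e.customer_name, e.country
FROM   stage.erp_customer e;
```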
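A minimal Denodo VQL-style sketch of a derived view joining base views over two sources; bv_oracle_customers and bv_hive_web_activity are hypothetical base-view names.

```sql
-- Illustrative Denodo derived view: join a base view over an Oracle
-- table with a base view over a Hive table into one virtual dataset.
CREATE OR REPLACE VIEW customer_360 AS
SELECT o.customer_id,
       o.customer_name,
       h.clickstream_visits
FROM   bv_oracle_customers o
       INNER JOIN bv_hive_web_activity h
               ON o.customer_id = h.customer_id;
```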
Environment: Hadoop, Python, Spark, Hive, Informatica PowerCenter 10.2/10.1, Oracle 11g, Netezza, NoSQL, NZSQL, Denodo, UNIX shell scripting, SQL, PL/SQL, TOAD, MS Access, MS Visio, Teradata utilities (BTEQ, FLOAD, TPUMP), PuTTY, WinSCP, WinCVS, Linux, Tableau, QlikView, TortoiseSVN, UltraEdit