Sr Informatica Developer
Insurance
Dec 2016 - present
Columbus, OH
- Worked on requirements gathering, architecting the ETL lifecycle, and creating design specifications and ETL design documents.
- Worked with IDQ 10.1 to perform data quality checks according to the business requirements.
- Identified and eliminated duplicates in datasets using the Edit Distance, Jaro Distance, and Mixed Field matchers, enabling a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
- Responsible for unit and integration testing of Informatica sessions, batches, and the target data.
- Scheduled workflows to pull data from the source databases at weekly intervals to maintain the most current, consolidated data.
- Developed Mapplets, reusable transformations, source and target definitions, and mappings using Informatica 10.0.
- Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
- Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
- Developed and maintained ETL (extract, transform, and load) mappings to extract data from multiple source systems such as Oracle, XML, SQL Server, and flat files and load it into Oracle.
- Worked on the design and development of Informatica mappings and workflows to load data into the staging area, data warehouse, and data marts in Oracle.
- Designed the source-to-target mappings and was involved in designing the selection criteria document.
- Wrote BTEQ scripts to transform data and used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
- Responsible for manually starting and monitoring production jobs based on business users' requests.
- Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Responsible for investigating production issues and resolving them in a timely manner.
- Developed Informatica processes to replace stored procedure functionality and provide the client with a time-effective, high-data-quality application.
- Analyzed the business requirements and created ETL logic to extract data from flat files coming from manufacturing sites in different geographic regions and load it into the data warehouse.
- Prepared ETL Specifications and design documents to help develop mappings.
- Created Mappings for Historical and Incremental loads.
- Used version control to check in and check out versions of objects.
- Worked on staging the data into work tables, cleansing it, and loading it further downstream into the dimension tables (using Type 1 and Type 2 logic) and fact tables that constitute the data warehouse.
- Worked with pmcmd to interact with the Informatica server from the command line and execute shell scripts (a minimal sketch follows this section's Environment line).
- Worked on a project based on the Agile SDLC methodology, with two-week software releases to the business users.
- Took part in daily standup and scrum meetings, the crux of Agile SDLC, to discuss the project lifecycle and progress and to plan accordingly.
- Provided post-release/production support.
Environment: Informatica PowerCenter 10.0, IDQ, IDE, Oracle Database 11g, SQL Server, Toad for Oracle, UNIX shell scripts, Teradata.
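The pmcmd usage mentioned above can be illustrated with a short shell sketch. This is a minimal example only; the domain, Integration Service, folder, and workflow names are placeholders rather than the actual project values.

```sh
#!/bin/ksh
# Minimal sketch of starting and monitoring a PowerCenter workflow with pmcmd.
# All names below are placeholders, not the actual project values.

INFA_USER="etl_batch"                   # hypothetical service account
INFA_PWD="$ETL_BATCH_PWD"               # password taken from the environment
DOMAIN="Domain_DEV"                     # placeholder domain
INT_SVC="IS_DEV"                        # placeholder Integration Service
FOLDER="DW_LOADS"                       # placeholder repository folder
WORKFLOW="wf_weekly_consolidated_load"  # placeholder workflow

# Start the workflow and wait for completion; pmcmd exits non-zero on failure.
pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"

if [ $? -ne 0 ]; then
    echo "Workflow $WORKFLOW failed" >&2
    exit 1
fi
```

Using -wait lets the calling scheduler pick up the workflow's success or failure from the script's exit code.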
ETL | Informatica Developer
Insurance
Mar 2015 - Nov 2016
Birmingham, AL
- Used Informatica PowerCenter 9.6 for extraction, transformation, and loading (ETL) of data into the data warehouse.
- Productively used the Informatica tools: Repository Manager, PowerCenter Designer, Workflow Manager, and Workflow Monitor.
- Created complex mappings using the concept of Slowly Changing Dimensions; implemented the business logic and captured rows deleted in the source systems.
- Worked extensively with the connected lookup transformations with the dynamic cache enabled.
- Developed several reusable transformations and Mapplets, which were used in other mappings.
- Created sessions and extracted data from various sources.
- Transformed the data according to the requirements and loaded it into the data warehouse.
- Worked in an Agile methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), and peer-reviewed their work.
- Involved in creating various mappings in the Informatica power center designer.
- Used Informatica PowerCenter for extraction, transformation, and loading of data from heterogeneous sources into the target databases.
- Worked extensively on complex mappings using transformations such as Source Qualifier, Expression, Router, Filter, Update Strategy, Connected and Unconnected Lookup, Normalizer, Joiner, and Aggregator.
- Extracted Data from different Source Systems like Oracle, Teradata and Flat files.
- Optimized SQL queries to improve performance.
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
- Performed data analysis for source and target systems, with a good understanding of data warehousing concepts, dimensions, facts, Star and Snowflake schemas, and ER modeling.
- Responsible for Unit testing, System and Integration testing.
- Worked with team to convert Trillium process into Informatica IDQ objects.
- Extensively used the Debugger to identify bugs in existing mappings by analyzing the data.
- Identified and fixed performance bottlenecks and tuned the Informatica mappings for better Performance.
- Extensively used the Informatica Debugger to figure out problems in mappings and was involved in troubleshooting existing ETL bugs.
- Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance (a log-scan sketch follows this section's Environment line).
Environment: Informatica PowerCenter 9.6 (Workflow Manager, Workflow Monitor, Mapplets), Oracle, IDQ, Teradata, SQL.
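As a companion to the session-log analysis noted above, here is a minimal shell sketch of scanning session logs for error messages. The log directory, file-name pattern, and message text are assumptions, not the actual environment.

```sh
#!/bin/ksh
# Minimal sketch of scanning PowerCenter session logs for errors while
# troubleshooting. The log directory, file pattern, and message text are
# assumptions about the environment, not actual project values.

LOG_DIR="/infa/infa_shared/SessLogs"   # hypothetical session log directory

# List each log that contains an error marker, then show its last few
# matching lines for a quick first look.
for logfile in "$LOG_DIR"/s_m_*.log; do
    [ -f "$logfile" ] || continue
    if grep -q "ERROR" "$logfile"; then
        echo "==== $logfile ===="
        grep "ERROR" "$logfile" | tail -10
    fi
done
```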
Sr Informatica Developer
Banking/Financial
Nov 2013 - Feb 2015
Atlanta, GA
- Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
- Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
- Developed ETL Informatica mappings to load data into the staging area; extracted data from mainframe files and databases and loaded it into the Oracle 11g target database.
- Created workflows and worklets for the Informatica mappings.
- Worked on SQL overrides for the generated SQL queries in Informatica.
- Involved in unit testing to validate the data from different data sources.
- Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions; worked with partitioned tables and automated partition drop and create in the Oracle database.
- Involved in migrating the ETL application from the development environment to the testing environment.
- Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.
- Worked with the Informatica toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
- Performed data conversion/data migration using Informatica PowerCenter.
- Involved in performance tuning to improve the data migration process.
- Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.
- Created UNIX shell scripts for Informatica pre/post-session operations.
- Automated the jobs using CA7 Scheduler.
- Worked on Direct Connect process to transfer the files between servers.
- Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
- Worked with XML targets for the data coming from SQL server source.
- Performed query tuning and used SQL query overrides in the Source Qualifier transformation to pull historical data no earlier than a given date, i.e., change data capture (CDC).
- Parameterized the whole process by using the parameter file for the variables.
- Imported the XSD file to create the XML target and created the hierarchical-relationship and normalized views.
- Implemented the logic by using HTTP transformation to query the web server.
- Configured and set up a secure FTP connection to the vendor using the Informatica Managed File Transfer software.
- Created complex shell scripts to automate sets of actions such as validating the presence of indicator files (a minimal sketch follows this section's Environment line).
- Pushed the generated compressed and encrypted XML files and flat files to the external vendor using MFT.
- Involved in Unit testing and system integration testing (SIT) of the projects.
- Assisted team members with the mappings developed as part of knowledge transfer.
Environment: Informatica PowerCenter 8.6.1/8.1.1, Windows Server 2008, MS SQL Server 2005, batch scripting, Perl scripting, XML targets, flat files, Tidal 5.3.1, UNIX.
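The indicator-file validation mentioned above can be sketched as a pre-session shell check. Directory and file names are placeholders; the actual push to the vendor went through Informatica Managed File Transfer and is not reproduced here.

```sh
#!/bin/ksh
# Minimal sketch of a pre-session indicator-file check. Directory and file
# names are placeholders, not the actual project layout; the real outbound
# transfer was handled by Informatica MFT and is not shown here.

LANDING_DIR="/data/vendor/outbound"         # hypothetical landing directory
INDICATOR="$LANDING_DIR/extract_done.ind"   # hypothetical indicator file

# Fail the pre-session command (and therefore the session) if the upstream
# feed has not dropped its indicator file yet.
if [ ! -f "$INDICATOR" ]; then
    echo "$(date '+%Y-%m-%d %H:%M:%S') missing indicator file: $INDICATOR" >&2
    exit 1
fi

echo "Indicator file present; session may proceed."
exit 0
```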
Industry: Volkswagen of America (Chennai, India / Auburn Hills, MI)
Role: ETL Developer/Analyst
Responsibilities:
• Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL specifications.
• Involved in designing dimensional models and data models using the Erwin tool.
• Created high-level design documents.
• Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
• Wrote complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, since the data volumes were heavy (a SQL*Plus validation sketch follows this section's Environment line).
• Responsible for creating workflows; created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
• Prepared user requirement documentation for mapping and additional functionality.
• Used PowerCenter ETL extensively to load data from source systems such as flat files into staging tables and then into the target Oracle database; analyzed the existing systems and performed a feasibility study.
• Analyzed the current system and programs and prepared gap analysis documents.
• Performed performance tuning and optimization of SQL statements using SQL Trace.
• Involved in Unit, System integration, User Acceptance Testing of Mapping.
• Supported the process steps in the development, test, and production environments.
Environment: Informatica PowerCenter 8.1.4/7.1.4, Oracle 10g/9i, TOAD, Business Objects 6.5/XIR2, UNIX, ClearCase.
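As referenced in the SQL override bullet above, a heavy override would typically be tried out in SQL*Plus before being pasted into the Source Qualifier. The sketch below uses an invented connect string, tables, and columns purely for illustration; it only shows the pattern of pushing the join into the database instead of an Informatica Joiner or Lookup.

```sh
#!/bin/ksh
# Minimal sketch of validating a Source Qualifier SQL override in SQL*Plus
# before placing it in the mapping. Connect string, tables, and columns are
# invented for illustration.

DB_CONN="etl_user/${ETL_PWD}@DWDEV"   # hypothetical Oracle connect string

sqlplus -s "$DB_CONN" <<'EOF'
-- The join is done in the database rather than in an Informatica Joiner or
-- Lookup, which is the performance pattern described in the bullet above.
SELECT o.order_id,
       o.order_dt,
       c.customer_nm
FROM   stg_orders    o
JOIN   stg_customers c ON c.customer_id = o.customer_id
WHERE  o.order_dt >= TO_DATE('2007-01-01', 'YYYY-MM-DD');
EOF
```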
Industry: AAA Solutions, Hyderabad, India
Role: ETL Developer
Responsibilities:
• Development, implementation, and training of users.
• Prepared ETL Specifications and design documents to help develop mappings.
• Created mappings for Historical and Incremental Loads.
• Used Version Control to check in and checkout versions of objects.
• Tuned Source, Target, Mappings, Transformations and Sessions for better performance.
• Supported daily loads and worked with business users to handle rejected data.
• Prepared and maintained mapping specification documentation.
• Used the Debugger to test the mappings and fix bugs.
• Used Mapping Variables and Mapping Parameters to fulfill the business requirements.
• Implemented Type I and Type II slowly changing Dimension to maintain all historical information in Dimension Tables.
• Involved in preparing technical and functional specifications.
• Performed performance analysis and tuning of the projects' SQL statements.
• Imported data from different sources into Oracle tables using Oracle SQL*Loader (a sqlldr sketch follows this section's Environment line).
• Wrote Oracle PL/SQL, stored procedures and triggers to populate data.
• Involved in writing complex SQL statements for reports.
• Worked closely with the client to research and resolve user-testing issues and bugs.
Environment: Informatica PowerCenter, ETL, UNIX, PL/SQL, TOAD, Oracle 8i, SQL, SQL*Loader.
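The SQL*Loader work mentioned above can be illustrated with a short shell sketch. The data file, control file, connect string, and staging table are placeholders, not actual project objects.

```sh
#!/bin/ksh
# Minimal sketch of a SQL*Loader load into an Oracle staging table. File
# names, the connect string, and the table/columns are placeholders.

DATA_FILE="/data/in/customers.dat"       # hypothetical comma-delimited feed
CTL_FILE="/tmp/customers.ctl"
DB_CONN="stage_user/${STAGE_PWD}@ORA8I"  # hypothetical connect string

# Generate a simple control file for the delimited feed.
cat > "$CTL_FILE" <<EOF
LOAD DATA
INFILE '$DATA_FILE'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, customer_nm, city)
EOF

# Run the load; sqlldr exits non-zero when errors occur.
sqlldr userid="$DB_CONN" control="$CTL_FILE" \
       log=/tmp/customers.log bad=/tmp/customers.bad || exit 1
```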