Pavan
pavankasaraneni@gmail.com
979-422-4846
Bryan, TX 77808
Sr. Informatica Consultant
13 years experience W2
Summary

Overall 10 years of IT experience in business analysis, design, data modeling, coding, implementation, and testing of applications, using a wide range of technologies including data warehousing, database, and reporting systems for the Sales & Distribution, Retail, and Finance industries.
• Good expertise in migrating Informatica sessions and mappings from one version to another.
• Well versed with Informatica 8.1, 8.6, 9.1, 9.5, 9.6 64 bit.
• Experience working with all Informatica transformations and applying transformation logic according to the requirements.
• Data warehousing experience using Informatica PowerCenter 9.x/8.x/7.x/6.2/5.0 and Informatica PowerExchange 8.1, extracting data from DB2, Oracle, MS SQL Server, MS Access, and AS/400.
• Extensive data quality experience, creating checks to assess the quality of data in the AmeriCredit project.
• Sound knowledge of Informatica Data Quality concepts and the IDQ tool.
• Extensive experience in using various Informatica Designer Tools like Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer and Warehouse Designer.
• Knowledge of the Business Objects reporting tool.
• Experience in integration of various data sources from Databases like MS Access, Oracle, SQL Server and file formats like flat-files, CSV files, COBOL files and XML files.
• Sound knowledge of Oracle 10g/9i/8i/8.0/7.x, MS SQL Server 2000, MS Access 2000, PL/SQL, SQL*Plus, DB2 8.0.
• Worked on various database tools like Toad, SQL Developer, PL/SQL Developer, Rapid SQL developer.
• Extensively executed SQL queries against Oracle (via Toad) and SQL Server tables to verify successful data loads and to validate data.
• Good knowledge of data warehouse concepts and principles (Ralph Kimball / Bill Inmon), star schema, snowflake schema, and SCDs.
• Experience integrating various data sources with multiple relational databases such as Oracle, SQL Server, and IBM DB2, as well as XML files; also worked on integrating data from fixed-width and delimited flat files.
• Used the Informatica Normalizer transformation to convert data from legacy systems.
• Expertise in several key areas of enterprise data warehousing, such as Change Data Capture (CDC), data quality, lookup tables, and ETL data movement.
• Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, and Normalization/ Denormalization strategies.
• Experience in database performance tuning.
• Experience using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments (a usage sketch follows this summary).
• Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
• Good knowledge of Informatica big data concepts and the BDE tool.
• Experience in ETL testing, involved in QA testing with various teams.
• Highly motivated to take on independent responsibility, with the ability to contribute as a productive team member.
• Ability to work on multiple projects simultaneously.
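A minimal shell sketch of the pmcmd usage mentioned above. The integration service, domain, folder, and workflow names (IS_DEV, Domain_DEV, FOLDER_SALES, wf_load_sales) and the ETL_PWD variable are hypothetical placeholders, not objects from any project listed here.

#!/bin/sh
# Start a PowerCenter workflow from a Unix host and wait for it to complete.
INFA_USER=etl_user
INFA_PWD="$ETL_PWD"   # password supplied through an environment variable

# -sv integration service, -d domain, -f repository folder, -wait blocks until done
pmcmd startworkflow -sv IS_DEV -d Domain_DEV \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FOLDER_SALES -wait wf_load_sales

# pmcmd exits 0 on success, so the return code can gate downstream batch steps.
echo "pmcmd exit code: $?"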

Experience
Sr. Informatica Consultant
Oct 2016 - present
Woonsocket, RI
CVS Health is a pharmacy innovation company, and the ISP project provides an innovative way to better serve customers/patients/members across the company's three lines of business: Retail, Specialty, and PBM. The proposed enterprise architecture creates an innovative and strategic enterprise asset that CVS Health will be able to leverage both in the near term and over the longer term, and will help maintain competitive advantage in the markets where the company does business. Worked in the Rx-Connect performance testing batch team from Oct 2016.
Skills: Informatica, ISP, Enterprise Architecture, CVS
SQL Developer, SBM TeamTrack, SharePoint
Information Technology
Feb 2017 - Feb 2017
Next worked in the MySchedule retail team on the RTM and WFM applications. Also worked on the new LTC (Omnicare) project, which CVS acquired. After LTC, was involved in the Splunk and Guardium installation on the MySchedule and LTC servers, a compliance requirement. Involved in the EWS Phase II project from July 2018 to December 2019. Started the MySchedule 4.1 project in January 2019 (ongoing) to support migrating the applications, RTM (the hub) and Retail WFM, to vendor cloud systems. Meanwhile, also worked on the DEWA integration project to feed HR and schedule data to the DEWA team. Responsibilities:
• Create test templates used for validation testing and for testing with the various toolsets, which include Unix batches and Control-M.
• Create results summaries once tests are completed, containing the required information on performance parameters such as CPU utilization, SBS, runtime, and environment stability during the test.
• Worked on various tasks, including enhancements and new development of ETL loads in the MySchedule project.
• Interface with business owners and project team resources on requirement gathering. Made various changes to the ETL loads in the WFM and RTM applications based on business requirements.
• Worked on the new LTC (Omnicare) project, creating ETL loads for the LTC application covering LTC employee forecast and demand. Involved in discussions with the RXDW team, which provides the source data, and obtained all needed clarifications from the source team on the files they provide.
• Held various discussions with the business team on the requirements for the ETL loads, which consist of HR load, Volume Drivers, and Punch. Once the requirements were finalized, prepared the ADDs (Application Design Documents) and got them approved by the various stakeholders.
• Once the ADDs were approved, worked on the ETL development and on the job setup through Control-M. Created batch jobs that invoke the Informatica workflows from Control-M via the pmcmd command.
• Performed unit testing once development was complete, then pushed the code to QA.
• Supported the QA team during QA testing. Once QA passed, the code moved to UAT, where end-to-end testing happens.
• Worked on setting up the DB connection from the DB2 database to the ETL server by creating the required catalog entries on the ETL server (see the catalog sketch after this project).
• Worked in creating purge mappings for the DB2 database tables.
• Also created Control-M FTP jobs for all ETL loads to place the ETL load files on the application server.
• Apart from this, also handled manual business requests that need to be tested in UAT and sent to PROD the same day, once the business confirms the data is fine in UAT.
• Also worked on the data masking job for LTC employee data: used substitution masking for employee names, expression masking for phone numbers and email IDs, and key masking for date of birth and SSN.
Environment: Informatica PowerCenter 9.6.1, PowerExchange 9.6.1, Data Masking, SQL Developer, TeamTrack, Unix (PuTTY), Oracle, Tortoise SVN, Control-M 7.0, WinSCP, IBM DB2, Toad for IBM DB2
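A minimal sketch of the DB2 catalog setup referenced above, run as the DB2 client instance owner on the ETL server. The node, host, port, database, alias, user, and DB2_PWD names are hypothetical placeholders, not project objects.

#!/bin/sh
# Register the remote DB2 server as a node, then catalog the database at that node.
db2 "catalog tcpip node LTCNODE remote db2host.example.com server 50000"
db2 "catalog database LTCDB as LTCDB at node LTCNODE"
db2 terminate    # refresh the directory cache so the new entries are visible

# Verify the entries and test connectivity with the ETL service account.
db2 list node directory
db2 list database directory
db2 "connect to LTCDB user etl_user using $DB2_PWD"
db2 connect reset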
Skills: DB2, ETL, Informatica, Informatica PowerCenter, Oracle, SQL, SQL Developer, SVN, TOAD, Compliance, Splunk, Database Design, IBM Guardium, MS SharePoint, FTP, Unit Testing
Sr. Informatica Developer/Lead
Information Technology
Aug 2012 - Jul 2016
TCS (client: Credit Suisse). Credit Suisse Group is a leading financial services company dealing with private banking, investment banking, and wealth management. Investment banking carries several risks; market risk can be defined as the day-to-day potential for an investor/bank to experience losses from fluctuations in securities prices, handled through the RDS application. RDS is an exhaustive risk data repository for market risk calculation and reporting, in which positions and sensitivities data flow from various front-office systems for the VaR calculations performed by MARS (the application for VaR calculation). Data is loaded into the staging tables by Informatica PowerCenter. Worked until December 2014 on the RNIV (Risk Not In VaR) project and, from January 2015, on the LE FO sourcing project, migrating some trades from a European entity to an Asian entity. Started working on the FDSF project in August 2015, covering new risks that are not part of VaR and are used in reporting. Also led the SRR project, which has regulatory requirements, and worked on the CCAR project, a regulatory reporting effort requiring the introduction of new CCAR risks. Responsibilities:
• Designed and developed ETL mappings using transformation logic for extracting data from various source systems.
• Worked on converting technical specifications into Informatica Mappings.
• Loaded data from different source systems like Flat files, Excel, Oracle, .txt files.
• Worked with the reporting team to agree on how much complexity would be handled on the ETL side versus the reporting side.
• Worked on Informatica tuning at mapping and sessions level.
• Extensively used Informatica PowerCenter 8.6/9.5 and created mappings using transformations such as Source Qualifier, Joiner, Normalizer, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
• Used the Normalizer transformation heavily to convert flat-file data into RDBMS structures.
• Used Mapplets, Parameters and Variables to facilitate the reusability of code.
• Worked on Pre and Post SQL queries on target.
• Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
• Created scripts that were reviewed by DBAs and checked into SVN using TortoiseSVN.
• Created unit test cases and checked them into the HP lifecycle management tool, which the QA team accesses for testing.
• Performed many comparisons between source data and staging data (via Excel) for QA (see the reconciliation sketch after this project).
• Worked in Agile model projects.
• Worked on various UAT and PROD incidents raised by users that required timely responses as part of L3 support.
• Solved many production tickets as part of PRODUCTION L3 SUPPORT.
• Involved in unit testing, user acceptance testing, and stress/integration testing to check whether the data extracted from different source systems loads into the target according to user requirements.
• Implemented many technical JIRAs and released them to production with proper user sign-offs.
• Documented the various development JIRAs worked on and created TDDs (technical design documents) incorporating the changes made.
• Responsible for Regression testing ETL jobs before test to production migration.
• Created scripts based on existing procedures according to the user requirements.
• Investigated and fixed problems encountered in the production environment on a day to day basis.
• Troubleshot data and performance issues raised by the user community; designed, developed, and deployed the code to fix them.
Environment: Informatica PowerCenter 8.6/9.5, PL/SQL Developer, JIRA, Unix (PuTTY), Oracle, TortoiseSVN, HP lifecycle management tool, SVN commit tool
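A sketch of the kind of source-vs-staging reconciliation query used for the QA comparisons mentioned above. The connection string, table names (src_positions, stg_positions), and columns are hypothetical placeholders, not project objects.

#!/bin/sh
# Run a reconciliation check from the shell via SQL*Plus.
sqlplus -s "etl_user/${ORA_PWD}@RDSDEV" <<'EOF'
SET PAGESIZE 200 LINESIZE 200

-- Rows present in the source extract but missing or different in staging.
SELECT trade_id, book_id, sensitivity_amt
  FROM src_positions
MINUS
SELECT trade_id, book_id, sensitivity_amt
  FROM stg_positions;

-- Quick row-count reconciliation per load date.
SELECT load_date, COUNT(*)
  FROM stg_positions
 GROUP BY load_date
 ORDER BY load_date;

EXIT
EOF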
Skills: Agile Methodology, ETL, Informatica, Oracle, PL/SQL, PL/SQL Developer, SQL, SVN, JIRA, Informatica Developer, SQL Developer
Informatica Consultant
Information Technology
Apr 2009 - Sep 2009
San Antonio, TX
The Patient Data Analytics Solution provides business intelligence analysis services to the billing department through interactive client tools. Data from various online transaction processing (OLTP) applications and other sources is selectively extracted, related, transformed, and loaded into the Oracle Data Warehouse (ODW) using the Informatica PowerCenter 8.1 ETL tool. The transformed data from the data warehouse is then loaded into an OLAP server to provide business intelligence analysis services. Responsibilities:
• Analyzed user requirements for the system
• Installed and configured Informatica PowerCenter 8.1 on client (Windows) / server (UNIX) for the Development, Test, and Production environments.
• Experience working with enterprise-wide conformed dimensions.
• Designed and Developed ETL mappings using transformation logic for extracting the data from various source systems.
• Developed mappings, mapplets by using mapping designer, and mapplet designer in Informatica Power Center 8.1 designer.
• Performed Data Extractions, Data Transformations, Data Loading, Data Conversions and Data Analysis.
• Worked heavily on processing the unstructured data.
• Developed Reusable Mapplets and Transformations for reusable business calculations
• Extracted data from different sources like Oracle, SQL Server 2005 and Flat files to load into ODW.
• Involved in collecting statistics on the original queries and analyzing them to set up a test environment.
• Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
• Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
• Used Expression, Lookup, Filter, Router, Sequence Generator, Update Strategy, and Joiner transformations.
• Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
• Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
• Performed tuning of Informatica sessions by implementing database partitioning, increasing block size, data cache size, sequence buffer length, target based commit interval and SQL overrides
• Documented standards for Informatica code development and prepared a handbook of standards. Involved in the migration of ETL code from the development repository to the testing repository and then to the production repository.
• Handled the dead lock errors that occurred due to dependency of jobs running simultaneously.
• Provided support for daily and weekly batch loads.
• Prepared a run book for the daily batch loads giving the job dependencies and how to restart a job when it fails, for ease of handling job failures during loads (a restart sketch follows this project).
Environment: Informatica PowerCenter 8.6.1, Business Objects XI, Oracle 11i/10g, TOAD, SQL, PL/SQL, Windows XP, UNIX.
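A run-book style restart sketch of the kind described above, using pmcmd to check the last run and rerun a failed workflow. The service, domain, folder, and workflow names and the INFA_PWD variable are hypothetical placeholders.

#!/bin/sh
# Check the last run of a workflow, then restart it if the failure was transient.
SV=IS_ODW; DOM=Domain_ODW; FLD=FOLDER_ODW; WF=wf_daily_billing_load

# Show the last run status (succeeded/failed/aborted) and task-level details.
pmcmd getworkflowdetails -sv "$SV" -d "$DOM" -u etl_user -p "$INFA_PWD" \
      -f "$FLD" "$WF"

# Rerun the workflow and wait; the exit code can be checked by the batch scheduler.
pmcmd startworkflow -sv "$SV" -d "$DOM" -u etl_user -p "$INFA_PWD" \
      -f "$FLD" -wait "$WF"
echo "restart exit code: $?"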
Skills: Business Intelligence, Data Analysis, Data Warehousing, ETL, Informatica, Informatica PowerCenter, Oracle, PL/SQL, SQL Server, TOAD, Data Conversion, SQL
Informatica Developer
Information Technology
Sep 2008 - Jan 2009
Jefferson City, MO
Wendy's is one of the largest restaurant chains in the USA and Canada. Worked on the project "DWM Rewrite to Informatica Phase 2", which concentrated on the weekly sales of the USA and Canada. The project mainly involved migrating the business logic for the weekly sales from IBM Data Warehouse Manager to Informatica. Responsibilities:
• Involved in developing mapping from the design documents.
• Extensively worked on the Wendy's weekly sales data from US and Canada.
• Used DB2 as the source to populate the data at the start of the week; Data Warehouse Manager had previously been used to load the weekly data.
• Reviewed and analyzed the existing ETL logic that was developed through earlier ETL tool "IBM Data Warehouse Center".
• Migrated existing ETL code from IBM Data Warehouse Manager to Informatica PowerCenter 8.1.1 by creating new mappings/sessions/workflows.
• Created flat files and imported them to the Unix box; checked the files using PuTTY.
• Used mapping designer to generate different mappings for different loads.
• Developed mappings, sessions, and workflows (ETL) for SCD Types I, II, and III to meet the requirements (a Type 2 load sketch follows this project).
• Worked on performance tuning and optimization of the Sessions, Mappings, Sources and Target.
• Created Informatica mappings with PL/SQL procedures/Functions to build business rules to load the data.
• Created shortcuts for the necessary sources and targets in the Versioned Repository shared folders to continue the migration work of Phase1.
• Responsible for Unit Test, System Integration Test and UAT to check the data quality and documenting the test results.
• Designed the necessary reusable command tasks for calling the appropriate Groovy scripts, which manipulate the data in Hyperion Essbase cubes.
• After migrating the ETL logic into Informatica code, tested the results between the Informatica and IBM Data Warehouse Center environments by designing test scripts in Mercury Quality Center.
• Migrated through various environments in Informatica 8.1 (Test, PQA and Prod).
• Responsible for writing unit test cases and performing the unit testing for data validity based on business rules.
• Designed and tested the reports in the Hyperion Essbase Excel Add-in to compare the reports between the Informatica and IBM Data Warehouse Center environments.
Environment: Informatica PowerCenter 8.1, IBM DB2 8, IBM Data Warehouse Center, Toad, SQL, PuTTY, Hyperion Essbase Administration Services, Mercury Quality Center 9.0
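A sketch of the expire-and-insert logic behind an SCD Type 2 load like the one referenced above, expressed as the SQL an Informatica mapping effectively performs. The database, tables, and columns (WKLYDW, dim_store, stg_store, store_nbr, etc.) are hypothetical placeholders, not Wendy's objects.

#!/bin/sh
# SCD Type 2: close out changed dimension rows, then insert new current versions.
db2 "connect to WKLYDW user etl_user using $DB2_PWD"

# Expire the current version of any store whose tracked attributes changed.
db2 "UPDATE dim_store d
        SET d.eff_end_dt = CURRENT DATE - 1 DAY,
            d.current_flag = 'N'
      WHERE d.current_flag = 'Y'
        AND EXISTS (SELECT 1 FROM stg_store s
                     WHERE s.store_nbr = d.store_nbr
                       AND (s.store_name <> d.store_name OR s.region <> d.region))"

# Insert a new current version for changed stores and for brand-new stores.
db2 "INSERT INTO dim_store (store_nbr, store_name, region,
                            eff_start_dt, eff_end_dt, current_flag)
     SELECT s.store_nbr, s.store_name, s.region,
            CURRENT DATE, DATE('9999-12-31'), 'Y'
       FROM stg_store s
      WHERE NOT EXISTS (SELECT 1 FROM dim_store d
                         WHERE d.store_nbr = s.store_nbr
                           AND d.current_flag = 'Y')"

db2 connect reset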
Skills: Data Warehousing, DB2, ETL, Groovy, Informatica, Informatica Developer, PL/SQL, Project Management, SQL, TOAD, UNIX, HP QC, Unit Testing
Education
Electrical and Computer Engineering
University of South Alabama 2007
Electronics and Communications
University of Madras S.A Engineering College 2004
Certifications
Brainbench Oracle PL/SQL Fundamentals
Brainbench certification in RDBMS Concepts
Skills
Skill | Last Used | Years
Informatica | 2021 | 8
ETL | 2017 | 4
Informatica Developer | 2016 | 4
Oracle | 2017 | 4
PL/SQL | 2016 | 4
SQL | 2017 | 4
Agile Methodology | 2016 | 3
CVS | 2021 | 3
Enterprise Architecture | 2021 | 3
ISP | 2021 | 3
JIRA | 2016 | 3
PL/SQL Developer | 2016 | 3
SQL Developer | 2017 | 3
SVN | 2017 | 3
HP QC | 2009 | 1
Unit Testing | 2017 | 1
Big Data | 0 | 1
Business Analysis | 0 | 1
Business Intelligence | 2009 | 1
Business Objects | 0 | 1
Compliance | 2017 | 1
Data Analysis | 2009 | 1
Data Conversion | 2009 | 1
Data Modeling | 0 | 1
Data Warehousing | 2009 | 1
Database Design | 2017 | 1
DB2 | 2017 | 1
ETL Developer | 2012 | 1
FTP | 2017 | 1
Groovy | 2009 | 1
IBM Guardium | 2017 | 1
IDQ | 0 | 1
Informatica Powercenter | 2017 | 1
MS SharePoint | 2017 | 1
Netezza | 2009 | 1
Performance Tuning | 2012 | 1
Production Support | 2012 | 1
Project Management | 2009 | 1
SAP | 0 | 1
Shell Scripts | 2009 | 1
Snowflake | 0 | 1
Splunk | 2017 | 1
SQL Server | 2009 | 1
Star Schema | 0 | 1
Stored Procedure | 2012 | 1
Test Case Preparation | 2012 | 1
TOAD | 2017 | 1
Triggers | 0 | 1
UNIX | 2009 | 1
XML | 0 | 1