Sudarvizhi
dsudarvizhi@gmail.com
804-714-9497
Nashville Elec Serv, TN 37246
Sr. ETL Consultant
14 years experience C2C
Summary

  • 9+ years of experience in IT and Data Warehouse implementations using Informatica PowerCenter 9.x/8.x/7.x, Oracle 11g/10g/9i, Teradata, Hadoop, Mainframes, XML, and flat files.
  • Extensive experience in analysis, design, development, implementation, enhancement and support of BI applications which includes strong experience in Data Warehousing (ETL & OLAP) environment as a Data Warehouse consultant.
  • Proficiency in utilizing ETL tool Informatica Power Center 9.x/8.x/7.x for developing the Data Warehouse loads with work experience focused in data acquisition and data integration.
  • Around 2 years of comprehensive hands-on experience in Big Data processing using Apache Hadoop and its ecosystem (Pig, Hive, Sqoop, and Oozie).
  • Expertise in designing conformed and traditional ETL architectures involving source databases: Mainframe systems (COBOL files), Oracle, flat files (fixed width, delimited), DB2, SQL Server, and XML; and target databases: Oracle, SQL Server, XML, and flat files (fixed width, delimited).
  • Have good understanding of ETL/Informatica standards and best practices, Slowly Changing Dimensions (SCD1, SCD2, and SCD3).
  • Strong knowledge of Dimensional Modeling, Star and Snowflake schema. Designed Fact and Dimension tables as per the reporting requirements and ease of future enhancements.
  • Extensive experience in designing and developing complex mappings applying various transformations such as lookup, source qualifier, update strategy, router, sequence generator, aggregator, rank, stored procedure, filter, joiner, and sorter transformations.
  • Extensive experience in developing the Workflows, Worklets, Sessions, Mappings and configuring the Informatica Server using Informatica Power Center.
  • Excellent knowledge in identifying performance bottlenecks and in tuning mappings and sessions using techniques such as partitioning and pushdown optimization.
  • Experience in optimizing query performance, session performance and fine tuning the mappings for optimum performance.
  • Created reusable transformation and mapplets in the designer using transformation developer and mapplet designer.
  • Experience with pre-session and post-session SQL commands to drop indexes on the target before the session runs and recreate them when the session completes.
  • Extensively monitored scheduled, running, completed, and failed sessions. Involved in debugging failed mappings and developing error handling methods.
  • Strong knowledge of the scheduling tool CONTROL-M.
  • Provided 24/7 active Data Warehouse production support as part of on-call rotation for both incremental and complete refresh.
  • Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
  • Well versed in UNIX shell scripting.
  • SME for the application; provided assistance to the support team during production issues.
  • Lead and provided guidance to both on-site and off-shore teams for various EDW initiatives.
  • Provided extensive Production Support for Data Warehouse for internal and external data flows to Teradata, Oracle DBMS from ETL servers via remote servers.
  • Experienced as point of contact to client for enhancement and support related activities.
  • Supported code deployment in QA and PROD environments and was responsible for code validation.
  • Responsible for production support activities: job monitoring, logging load statistics, analyzing and resolving production issues, coordinating with the business to fix source file issues, and coordinating with DBAs to resolve table space or file system issues.
  • Responsible for deliverables from monthly/quarterly/half-yearly/yearly jobs.
  • Experience in development and documentation throughout the entire SDLC using Waterfall and Agile methodologies.
  • Excellent communication, business interaction, and interpersonal skills; deal effectively with customer groups and clients up to top management.
  • Knowledge of ETL techniques, analysis, and reporting tools.
  • Experience working with Insurance, Banking, and Telecom domain clients.
  • Excellent team player and self-starter; able to work independently, with strong analytical, problem-solving, and logical skills.
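The SCD Type 1/2/3 handling mentioned above can be sketched in miniature. This is a hedged illustration only, not project code: the dictionary-based dimension and the `cust_id`/`city`/`eff_date` names are hypothetical stand-ins for the real warehouse tables.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional open-ended expiry date

def apply_scd2(dimension, incoming, key, attrs, load_date):
    """Apply SCD Type 2 logic: expire changed rows and insert new versions.

    dimension: list of dicts with key, attrs, plus 'eff_date', 'end_date', 'current'
    incoming:  list of dicts with key and attrs (today's source extract)
    """
    result = [dict(r) for r in dimension]
    current = {r[key]: r for r in result if r["current"]}
    for src in incoming:
        existing = current.get(src[key])
        if existing and all(existing[a] == src[a] for a in attrs):
            continue  # no change: keep the current row as-is
        if existing:
            # change detected: close out the old version
            existing["end_date"] = load_date
            existing["current"] = False
        # insert the new current version of the row
        result.append({key: src[key], **{a: src[a] for a in attrs},
                       "eff_date": load_date, "end_date": HIGH_DATE,
                       "current": True})
    return result
```

In an Informatica mapping the same comparison is typically done with a Lookup on the dimension plus an Update Strategy transformation; the function above just makes the row-versioning rule explicit.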

Experience
Sr. ETL Consultant
Information Technology
Mar 2016 - present
  • Cigna is a global health service company dedicated to helping people improve their health, well-being and sense of security. All products and services are provided exclusively through operating subsidiaries of Cigna Corporation, including Connecticut General Life Insurance Company, Cigna Health and Life Insurance Company.
  • Responsibilities:
  • Requirements Analysis, cost estimates, technical design creation and design reviews
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor
  • Extracted data from various centers, residing in different systems such as mainframe files and flat files, and loaded it into Oracle staging using Informatica PowerCenter 9.6
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy
  • Implemented the concept of slowly changing dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension
  • Worked on BI Analytics for the data analysis
  • Created critical reusable transformations, mapplets, and worklets wherever necessary
  • Integrated IDQ mappings, rules as mapplets within Power Center Mappings
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL
  • Wrote complex SQL queries involving multiple tables with joins; also generated queries to check data consistency in the tables and to update them per business requirements
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data
  • Implemented restart strategy and error handling techniques to recover failed sessions
  • Used Unix Shell Scripts to automate pre-session and post-session processes
  • Used Active Batch scheduler to schedule and run Informatica workflows on a daily/weekly/monthly basis
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis, and delivering the transformed data as coordinated
  • Reviewed defects raised by the UAT team and followed up on critical defects with the team to ensure they were fixed
  • Environment: Informatica 9.6, Informatica Power Exchange, Oracle 12c, UNIX, SQL, PL/SQL, Mainframe, Teradata 14.0, Active Batch, SQL Server
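The restart strategy mentioned above (recovering failed sessions) can be sketched as a bounded-retry wrapper. This is an illustrative stand-in, not the actual recovery logic; `run_session` is a hypothetical callable representing one workflow/session run.

```python
import time

def run_with_restart(run_session, max_attempts=3, wait_seconds=0):
    """Retry a session run a bounded number of times.

    run_session: callable returning True on success, False on failure.
    Returns the attempt number on which the session succeeded.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            if run_session():
                return attempt
        except Exception:
            pass  # treat an unexpected exception like a failed session
        if attempt < max_attempts:
            time.sleep(wait_seconds)  # back off before the next attempt
    raise RuntimeError(f"session still failing after {max_attempts} attempts")
```

In practice the retry loop would live in a scheduler job or shell wrapper around `pmcmd`; the point here is only the bounded-attempts-with-backoff shape.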
Skills: SQL, Business Requirements, Stored Procedure, SQL Server, Oracle, ETL, Triggers, Requirement Analysis, IDQ, Informatica, Informatica PowerCenter, PL/SQL, Data Analysis, Teradata, TOAD, Project Management, UAT
Technical Lead
Information Technology
Dec 2012 - Feb 2015
  • Studied project plans and analyzed the various phases of work.
  • Allocated work to team members, kept track of work progress, and provided timely reports to the project manager.
  • Provided technical guidance to team members.
  • Interacted with the onsite Project Manager and kept them updated on work progress.
  • Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.
  • Scheduled and automated jobs to meet user needs.
  • Managed ad-hoc and tactical work requests.
  • Created complex mappings in PowerCenter Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Developed mappings/reusable objects/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica PowerCenter.
  • Worked extensively with different caches such as Index cache, Data cache and Lookup cache (Static, Dynamic, Persistence and Shared).
  • Developed UNIX shell scripts to automate applications, schedule jobs and for audit purposes.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, scorecards, and IDQ's reporting and monitoring capabilities.
  • Developed and modified complex SQL queries and stored procedures as per business requirements
  • Worked on performance tuning to increase the data load speed.
  • Prepared scripts to create new tables, views and queries for new enhancement in the project using TOAD.
  • Developed various mappings with the collection of all sources, targets and transformations.
  • Extracted the data from various sources like DB2, flat files and COBOL files.
  • Environment: Informatica PowerCenter 9.1, UNIX shell scripting, Mainframe, Oracle 11g, DB2, Informatica IDQ, Informatica Power Exchange 9.1.0
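The data cleansing and matching work described above (IDQ-style standardization and duplicate detection) can be sketched as follows. This is a simplified exact-match illustration with made-up records, not the project's actual IDQ rules, which would typically also use fuzzy matching.

```python
import re

def standardize(name):
    """Basic cleansing: drop punctuation, collapse whitespace, title-case."""
    name = re.sub(r"[^\w\s]", "", name)
    name = re.sub(r"\s+", " ", name).strip()
    return name.title()

def match_duplicates(records, field="name"):
    """Group records whose standardized field values collide (exact-match rule).

    Returns only the groups containing more than one record, i.e. duplicates.
    """
    groups = {}
    for rec in records:
        groups.setdefault(standardize(rec[field]), []).append(rec)
    return {k: v for k, v in groups.items() if len(v) > 1}
```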
Skills: Business Requirements, Auditing, IDQ, Informatica, Data Cleansing, Data Conversion, Ember.JS, Stored Procedure, ETL, Informatica PowerCenter, Shell Scripts, UNIX, XML, ETL Architect, Scripting
Informatica Technical Lead
Information Technology
Oct 2011 - Aug 2012
  • Design and development of new data warehouse (Analytical Data Warehouse) for better reporting and analysis.
  • Worked with several vendors in sending outbound extracts and load in bound files into the warehouse.
  • Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter 9.6
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
  • Used Teradata utilities FastLoad and MultiLoad to load data; built, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
  • Resolved various defects in a set of wrapper scripts that executed the Teradata BTEQ, MLOAD, and FLOAD utilities and UNIX shell scripts.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Expert level knowledge of complex SQL using Teradata functions, macros and stored procedures.
  • Analyzed the source data, made decisions on appropriate extraction, transformation, and loading strategies.
  • Created FastLoad jobs to load data from various data sources and legacy systems into Teradata staging.
  • Modified existing BTEQ scripts to enhance performance by using volatile tables, incorporating parallelism, collecting statistics when needed, and applying index techniques.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.
  • Provided on-call support during the release of the product from lower-level to production environments.
  • Worked with scheduling tool for jobs scheduling.
  • Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
  • Created and developed reusable transformations, mappings, and mapplets conforming to the business rules.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results.
  • Involved in developing mapping to populate data from various systems.
  • Environment: Informatica Power Center 8.6, UNIX shell scripting, ESP Scheduler, Mainframe, Teradata
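The BTEQ tuning pattern described above (staging into a volatile table and collecting statistics before the load) can be illustrated by generating a script skeleton. Everything here is hypothetical: the `tdprod` TDPID, `etl_user`, and the table/column names are placeholders, not the project's real objects.

```python
def build_bteq_script(stage_table, target_table, key_col):
    """Generate an illustrative BTEQ script: stage into a volatile table,
    collect stats on the join/index column, then insert into the target."""
    return "\n".join([
        ".LOGON tdprod/etl_user,${TD_PASSWORD};",
        f"CREATE VOLATILE TABLE v_{stage_table} AS "
        f"(SELECT * FROM {stage_table}) WITH DATA ON COMMIT PRESERVE ROWS;",
        f"COLLECT STATISTICS ON v_{stage_table} COLUMN ({key_col});",
        f"INSERT INTO {target_table} SELECT * FROM v_{stage_table};",
        ".IF ERRORCODE <> 0 THEN .QUIT 8;",  # fail the wrapper on load error
        ".LOGOFF;",
    ])
```

A shell wrapper would feed the generated text to `bteq` and inspect its exit code, which is the kind of wrapper-script defect work mentioned above.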
Skills: Informatica, Business Requirements, Teradata, Data Warehousing, Informatica PowerCenter, ETL, XML, BTEQ, Stored Procedure, Shell Scripts, UNIX, SQL, Scripting
ETL Informatica DWH Module Lead
Information Technology
Jul 2010 - Oct 2011
  • Responsible for definition, development, and testing of processes/programs necessary to extract data from operational databases, transform and cleanse it, and load it into the data warehouse using Informatica PowerCenter.
  • Worked on Informatica PowerCenter Tool. Extensively worked with SQL, Expression, Filter, Lookup, Normalizer etc.
  • Worked on the change data capture (CDC) process to extract the data.
  • Developed various mappings with the collection of all sources, targets and transformations.
  • Worked with Business users in consolidating requirements and data mapping of reporting requirements.
  • Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database thereby tuning it by identifying and eliminating the bottlenecks for optimum performance.
  • Made use of Post-Session success and Post-Session failure commands in the session task to execute scripts needed for clean up and update purposes.
  • Extensively worked with various lookup caches like Static Cache, Dynamic Cache, and Persistent Cache.
  • Requirements gathering and analysis; involved in creating High-Level Design documents and Application Design documents.
  • Used various transformations such as Source Qualifier, Filter, Aggregator, Expression, Lookup, Sequence Generator, Joiner, Union, Router, Sorter, and Update Strategy to create mappings.
  • Used the ETL process to extract, transform, and load data into the staging area and data warehouse.
  • Created reusable transformations, mappings, and mapplets, and various tasks such as Session, Command, Email, Timer, Control, Event Wait, Event Raise, and Assignment, plus Worklets and Workflows.
  • Coordinated the team and resolved the team's issues both technically and functionally.
  • Scheduled jobs and monitored their daily runs in production.
  • Environment: Informatica 8.6, UNIX, Oracle 10g, HP Quality Center, SFTP, Super Putty
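The change data capture process mentioned above can be sketched as a snapshot comparison: diff the previous extract against the current one and classify each row. This is a minimal illustration of the idea, assuming full snapshots are available; the actual project may have used log-based CDC via Power Exchange instead.

```python
def capture_changes(previous, current, key):
    """Snapshot-compare CDC: classify rows as inserts, updates, or deletes.

    previous, current: lists of dicts (yesterday's and today's extracts)
    key: name of the natural-key field used to match rows across snapshots
    """
    prev = {r[key]: r for r in previous}
    curr = {r[key]: r for r in current}
    return {
        "insert": [curr[k] for k in curr.keys() - prev.keys()],
        "update": [curr[k] for k in curr.keys() & prev.keys()
                   if curr[k] != prev[k]],
        "delete": [prev[k] for k in prev.keys() - curr.keys()],
    }
```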
Skills: SQL, Informatica, Data Warehousing, Oracle, Informatica PowerCenter, Data Mapping, ETL, UNIX, HP QC, PuTTY, Requirements Gathering
Senior Developer
Information Technology
Feb 2007 - Feb 2010
  • Analyzing user requirements, designing and documenting functional specification and validation documents.
  • Designing ETL mapping as per functional specification document.
  • Developing ETLs using Informatica mapping, UNIX Shell and providing alternative solutions to common problems. Resolving various technical issues with ETL mapping.
  • Developing data mart mappings for reporting purpose.
  • Performance optimization of Teradata SQL queries; designed/created several BTEQ/MLOAD/FLOAD scripts with data transformations for loading the base tables, applying business rules to manipulate and/or massage the data according to the requirements.
  • Developed several complex mappings in Informatica PowerCenter using a variety of transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter Files in Mapping Designer.
  • Performing extensive testing of ETL mapping with various test scenarios.
  • Environment: Informatica PowerCenter 8.1/8.6, Oracle 10g, UNIX, Teradata
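The parameter files mentioned above follow Informatica's section-based format (`[Folder.WF:workflow]` headings with `$$NAME=value` lines). A minimal parser sketch, with a hypothetical sample file, shows the structure:

```python
def parse_parameter_file(text):
    """Parse an Informatica-style parameter file into {section: {param: value}}.

    Blank lines and '#' comments are skipped; values keep everything
    after the first '='.
    """
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params
```

The folder, workflow, and parameter names in the test below are made up for illustration.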
Skills: SQL, Data Marts, Oracle, ETL, BTEQ, Informatica, Informatica PowerCenter, Teradata, UNIX
Education
Computer Science, Periyar University (year not provided)
Skills
Skill | Last Used | Years
ETL | 2021 | 8
Informatica | 2021 | 8
Informatica PowerCenter | 2021 | 8
UNIX | 2015 | 7
SQL | 2021 | 6
Oracle | 2021 | 5
Teradata | 2021 | 5
Scripting | 2015 | 4
BTEQ | 2012 | 3
Business Requirements | 2021 | 3
Data Marts | 2010 | 3
ETL Architect | 2015 | 3
IDQ | 2021 | 3
Project Management | 2021 | 3
Shell Scripts | 2015 | 3
Stored Procedure | 2021 | 3
UAT | 2021 | 3
XML | 2015 | 3
Auditing | 2015 | 2
Data Cleansing | 2015 | 2
Data Conversion | 2015 | 2
Data Warehousing | 2012 | 2
Ember.JS | 2015 | 2
TOAD | 2021 | 2
Data Mapping | 2011 | 1
HP QC | 2011 | 1
PuTTY | 2011 | 1
Requirements Gathering | 2011 | 1
Agile Methodology | n/a | 1
Big Data | n/a | 1
BMC Control-M | n/a | 1
Data Analysis | 2021 | 1
Data Integration | n/a | 1
DB2 | n/a | 1
Hadoop | n/a | 1
Hive | n/a | 1
Oozie | n/a | 1
Pig | n/a | 1
PL/SQL | 2021 | 1
Production Support | n/a | 1
Quality Assurance | n/a | 1
Requirement Analysis | 2021 | 1
SDLC | n/a | 1
Snowflake | n/a | 1
SQL Server | 2021 | 1
Sqoop | n/a | 1
Triggers | 2021 | 1
Waterfall | n/a | 1