Anil
anilreddy2606@gmail.com
908-547-0565
Hartford, CT 06199
Software Developer
11 years experience W2
Summary

Overall 7+ years of experience in Information Technology with an emphasis on Data Warehouse/Data Mart development, designing strategies for extraction, transformation, and loading (ETL) from various sources using Informatica PowerCenter 9.5/9.1/8.6/8.1, Informatica Data Quality, and Talend Open Studio.
• Experience working with Data Warehousing concepts such as OLAP, OLTP, Star Schema, Snowflake Schema, and Logical/Physical/Dimensional Data Modeling.
• Extensively used ETL methodology for Data Profiling, Data Migration, Extraction, Transformation and Loading using Informatica PowerCenter, and designed data conversions from a wide variety of source systems including Netezza, Oracle, SQL Server, Teradata and non-relational sources such as flat files, XML and SFDC.
• Strong experience in designing and developing complex mappings from varied transformation logic like Unconnected and Connected lookups, Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy.
• Extracted data from multiple operational sources for loading the staging area, Data Warehouse and Data Marts using SCD (Type 1/Type 2/Type 3) loads.
• Good experience working with UNIX shell scripts for automatically running sessions, aborting sessions and creating parameter files via the Control-M scheduler.
• Experience in monitoring and scheduling using Control-M and Job Conductor (Talend Admin Console) and in UNIX (Korn and Bourne shell) scripting.
• Good experience writing a number of shell scripts to run various batch jobs.
• Worked extensively on Error Handling, Performance Analysis and Performance Tuning of Informatica ETL Components, Teradata Utilities, UNIX Scripts, SQL Scripts etc.
• Proven track record in troubleshooting Informatica Sessions and addressing production issues like performance tuning and enhancement.
• Excellent knowledge of Informatica Data Quality transformations such as Labeler, Parser, Address Validator and Standardizer, along with general transformations like Lookup, Expression, Filter, Router and Normalizer.
• Involved in import/export of existing mappings across environments including Development, QA and Production.
• Experience with post-session and pre-session shell scripts for tasks such as merging flat files, creating and deleting temporary files, and renaming files to reflect the generation date.
• Worked intensively on client ETL code migration from Informatica to Talend Studio.
• 6 months of experience using Talend Open Studio (6.3.1) and 6 months of experience with Talend Admin Console (TAC).
• Proficient in integrating various data sources with multiple relational databases like Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata and flat files into the staging area, ODS, Data Warehouse and Data Mart.
• Responsible for Unit testing and Integration testing of mappings and workflows.
• Strong knowledge of Relational Database concepts, Entity Relationship Diagrams, and Normalization/Denormalization concepts.
• Detail-oriented with good problem-solving, organizational, analysis and requirement-gathering skills.
• Articulate, with excellent communication and interpersonal skills and the ability to work in a team as well as individually.
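The post-session file renaming described above can be sketched as a small shell step; the directory layout and the extract_*.dat naming convention here are illustrative assumptions, not from any specific project:

```shell
#!/bin/sh
# Post-session step: rename generated flat files to carry their generation date.
# The extract_*.dat naming convention is an illustrative assumption.
stamp_files() {
  dir="$1"
  stamp=$(date +%Y%m%d)
  for f in "$dir"/extract_*.dat; do
    [ -e "$f" ] || continue          # glob matched nothing
    base=$(basename "$f" .dat)
    mv "$f" "$dir/${base}_${stamp}.dat"
  done
}

# demo: create a scratch directory with one extract file and stamp it
demo_dir=$(mktemp -d)
touch "$demo_dir/extract_orders.dat"
stamp_files "$demo_dir"
ls "$demo_dir"
```

A script like this would typically run as a post-session command so downstream jobs can pick up the dated file unambiguously.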

Experience
Software Developer
Information Technology
Jun 2018 - Jun 2018
Woodbridge, NJ
Prudential Financial is a Fortune Global 500 and Fortune 500 company whose subsidiaries provide insurance, investment management, and other financial products and services to both retail and institutional customers throughout the United States and in over 30 other countries. Principal products and services include life insurance, annuities, mutual funds, pension- and retirement-related investments, administration and asset management, securities brokerage services, and commercial and residential real estate in many states of the U.S.
Responsibilities:
• Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
• Interacted actively with Business Analysts, SME and Data Modelers on Mapping documents and high-level design documents for various Sources and Targets.
• Developed a strong understanding of the annuities business flow and created an ETL layer between source systems, the staging area and target systems.
• Extracted and loaded data for the Contracts, Party, Agreements, Product and Funds, Transactions and Actuarial modules, with each subject area drawing on different source systems such as VPAS, FAST, LifeCAD and SE2.
• Extracted and loaded diverse data types (structured and semi-structured: JSON, delimited flat files, XML, etc.) from Oracle, DB2, MongoDB, Salesforce and AWS S3.
• Created Informatica mapplets/mappings/tasks/worklets/workflows using PowerCenter.
• Made use of various PowerCenter Designer transformations like Source Qualifier, Connected and Unconnected Lookups, Expression, Filter, Router, Sorter, Aggregator, Joiner, Normalizer, Rank, Sequence Generator, Union and Update Strategy while creating mapplets/mappings.
• Worked on different tasks in Workflow Manager like Sessions, Event Wait, Decision, E-mail, Command, Assignment, and scheduling of the workflow.
• Implemented Slowly Changing Dimension (SCD Type II) design for the Data Warehouse.
• Utilized Informatica PowerExchange for Salesforce to load data into Salesforce Objects.
• Created mappings to load data into an AWS S3 bucket using the Informatica S3 connector, and populated Oracle tables from the S3 bucket.
• Performed Data Quality Checks, Cleansed the incoming data feeds and profiled the source systems data as per business rules using IDQ.
• Designed IDQ mapplets/rules by using numerous transformations like Standardizer, Parser, Address validator, Joiner, Expression, Case converter, Router, Comparison.
• Designed/Developed Data validation and reconciliation, Balance and Audit Control system within the ETL pipeline to ensure that data can be audited after it's loaded and that errors can be debugged.
• Developed batch processing using DB2 to automate loading, pushing and pulling data from different admin systems, generate parameter files for ETL and keep the batch history.
• Created UNIX shell scripts to validate header, trailer and detail records before processing flat files in Informatica.
• Developed reusable UNIX shell scripts using pmcmd functionality to start and stop sessions.
• Created UNIX shell scripts to create parameter files, rename files and create list files.
• Created and scheduled jobs in the Autosys Workload Scheduler to run Informatica PowerCenter workflows and shell scripts.
• Created JIL (Job Information Language) scripts to set up complex job interdependencies, event dependencies, time schedules and alerts.
• Created Autosys schedule calendars specific to each source system.
• Created validation rules in Informatica Data Validation Option tool to perform Source to Target Data, Data Quality, Performance, Data Integration, Application Migration testing and Data Constraint Check, Duplicate Data Check.
• Set up table pairs and different types of test rules in Data Validation Option to compare individual records between sources and targets: constraint-based data validation, expression-based tests, aggregate function tests, join views, validity of lookup logic, and SQL views.
Environment: Informatica PowerCenter 11, Informatica Developer, Informatica DVO, Oracle, Flat Files, Putty, XML, UNIX, Salesforce, JSON, AWS and Autosys.
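The header/trailer validation mentioned in the responsibilities above can be sketched as a small shell check. The file layout here is a hypothetical assumption (first line a header starting with "H", last line a trailer of the form "T|<detail count>", everything between detail records), not the project's actual spec:

```shell
#!/bin/sh
# Pre-processing check: reject a flat file whose header, trailer, or
# detail count is wrong before Informatica picks it up.
# The H/D/T pipe-delimited layout is an illustrative assumption.
validate_file() {
  file="$1"
  header=$(head -n 1 "$file")
  trailer=$(tail -n 1 "$file")
  total=$(wc -l < "$file")
  details=$((total - 2))            # everything between header and trailer

  case "$header" in H*) ;; *) echo "bad header"; return 1 ;; esac
  case "$trailer" in T\|*) ;; *) echo "bad trailer"; return 1 ;; esac

  expected=${trailer#T|}
  if [ "$details" -ne "$expected" ]; then
    echo "count mismatch: trailer says $expected, file has $details"
    return 1
  fi
  echo "OK: $details detail records"
}

# demo file with two detail records
tmp=$(mktemp)
printf 'H|20180601\nD|rec1\nD|rec2\nT|2\n' > "$tmp"
validate_file "$tmp"
```

In practice the script's exit status would gate the Informatica session: a non-zero return quarantines the file instead of loading it.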
Software Developer
Information Technology
Feb 2016 - Jun 2018
Farmington, CT
This project aimed at building reusable code for all the PRM and CRM businesses, applying it to new sources added to the Evariant Data Warehouse, and providing information to customers and third-party vendors for benefit analysis. The project also involved enhancements to the ETL code for existing Evariant sources.
Responsibilities:
• Involved in analysis of source systems, business requirements and identification of business rules; responsible for developing, supporting and maintaining the ETL process using Informatica.
• Created / updated ETL design documents for all the Informatica components changed.
• Extracted data from heterogeneous sources like Oracle, XML and flat files, performed data validation and cleansing in the staging area, then loaded into the data warehouse in Oracle 11g.
• Made use of various Informatica source definitions viz. Flat files and Relational sources.
• Made use of various Informatica target definitions viz. relational data base targets.
• Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows using Power Center to load the data from source to stage, stage to persistent, stage to reject and stage to core.
• Made use of various PowerCenter Designer transformations like Source Qualifier, Connected and Unconnected Lookups, Expression, Filter, Router, Sorter, Aggregator, Joiner, Rank, Sequence Generator, Union and Update Strategy while creating mapplets/mappings.
• Deep understanding of core data quality design patterns and the associated challenges involved with data analysis, certification, modelling, quality improvement.
• Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created various data quality mappings in the Informatica Data Quality tool and imported them into Informatica PowerCenter as mappings/mapplets.
• Excellent knowledge of Informatica Data Quality transformations such as Labeler, Parser, Address Validator and Standardizer, along with general transformations like Lookup, Expression, Filter, Router and Normalizer.
• Experience in Match & Merge setup. Experience in setting up fuzzy and exact match rules.
• Good understanding of MDM architecture and Informatica MDM Hub console.
• Made use of reusable Informatica transformations, shared sources and targets.
• Created different parameter files and changed Session parameters, mapping parameters, and variables at run time.
• Implemented various loads like daily, weekly, quarterly and on-demand loads using an incremental loading strategy and Change Data Capture (CDC) concepts.
• Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
• Created mappings for Type1, Type2 slowly changing dimensions (SCD) / complete refresh mappings.
• Extensively used various Data Cleansing and Data Conversion functions like LTRIM, RTRIM, TO_DATE, Decode, and IIF functions in Expression Transformation.
• Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
• Worked with "pmcmd" command line program to communicate with the Informatica server, to start, stop and schedule workflows.
• Extensively used UNIX Shell Scripts to automate the jobs.
• During the project, participated in multiple meetings with the client and data architect / ETL architect to propose better strategies for performance improvement and gather new requirements.
• Extensively created mappings in Talend using tMap, tJoin, tParallelize, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.
• Extensive experience using Talend features such as context variables, triggers, and connectors for databases and flat files, like tMysqlInput, tMysqlConnection, tOracle, tMSSqlInput, tMSSqlOutput, tMSSqlRow, tFileCopy, tFileInputDelimited and tFileExists. Experience using cloud components and connectors to make API calls for accessing data from cloud storage (Google Drive, Salesforce, Dropbox) in Talend Open Studio.
Environment: Informatica PowerCenter 10.1.0, Informatica Developer 10.1.0, Oracle 11g, Flat Files, Putty, XML, UNIX, Netezza, SFDC.
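The run-time parameter files mentioned in the responsibilities above can be generated by a small shell step. The section-header format `[folder.WF:workflow]` follows PowerCenter's parameter-file convention, but the folder, workflow and `$$` parameter names below are hypothetical placeholders, not from this project:

```shell
#!/bin/sh
# Generate an Informatica parameter file for a workflow run.
# DW_FOLDER, wf_daily_load and the $$ parameters are illustrative assumptions.
write_param_file() {
  out="$1"
  run_date=$(date +%Y-%m-%d)
  cat > "$out" <<EOF
[DW_FOLDER.WF:wf_daily_load]
\$\$RUN_DATE=$run_date
\$\$SRC_DIR=/data/incoming
\$\$LOAD_TYPE=INCREMENTAL
EOF
}

pfile=$(mktemp)
write_param_file "$pfile"
cat "$pfile"
```

A scheduler (Control-M, Autosys, etc.) would run this before invoking the workflow so each run picks up fresh dates and paths without editing the session.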
Software Developer
Information Technology
Oct 2015 - Feb 2016
West Des Moines, IA
Athene USA provides products in the retirement savings market, including retail and fixed indexed annuity products as well as institutional products, such as funding agreements. The project consisted of retrieving sensitive data, performing query manipulations, creating reports and performing auditing services before loading the data into the data warehouse for further analysis.
Responsibilities:
• Involved in analysis of source systems, business requirements and identification of business rules; responsible for developing, supporting and maintaining the ETL process using Informatica.
• Created / updated ETL design documents for all the Informatica components changed.
• Extracted data from heterogeneous sources like Oracle, XML, PostgreSQL and flat files, performed data validation and cleansing in the staging area, then loaded into the data warehouse in Oracle 11g.
• Made use of various Informatica source definitions viz. Flat files and Relational sources.
• Made use of various Informatica target definitions viz. relational data base targets.
• Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows using PowerCenter to load the data from source to stage, stage to persistent, stage to reject and stage to core.
• Made use of various PowerCenter Designer transformations like Source Qualifier, Connected and Unconnected Lookups, Expression, Filter, Router, Sorter, Aggregator, Joiner, Rank, Sequence Generator, Union and Update Strategy while creating mapplets/mappings.
• Made use of reusable Informatica transformations, shared sources and targets.
• Created different parameter files and changed Session parameters, mapping parameters, and variables at run time.
• Implemented various loads like daily, weekly, quarterly and on-demand loads using an incremental loading strategy and Change Data Capture (CDC) concepts.
• Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
• Created mappings for Type1, Type2 slowly changing dimensions (SCD) / complete refresh mappings.
• Extensively used various Data Cleansing and Data Conversion functions like LTRIM, RTRIM, TO_DATE, Decode, and IIF functions in Expression Transformation.
• Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
• Worked with "pmcmd" command line program to communicate with the Informatica server, to start, stop and schedule workflows.
• Created job streams and added job definitions in Control-M and executed them.
• During the course of the project, participated in multiple meetings with the client and the data/ETL architects to propose better strategies for performance improvement and to gather new requirements.
Environment: Informatica PowerCenter 9.5, Oracle 11g, PostgreSQL, XML, Flat Files, Win7, DbVisualizer, Control-M, Toad and Putty.
Software Developer
Information Technology
Jun 2010 - Aug 2013
This project built data analytics to analyze and improve on changes driven by day-to-day activities.
Responsibilities:
• Developed ETL mappings, Transformations and Loading using Informatica Power Center 8.6.1.
• Extensively used ETL to load data from flat files and MS Excel, covering both fixed-width and delimited files, as well as from the relational database, Oracle 10g.
• Developed and tested all the Informatica mappings, sessions and workflows, involving several tasks.
• Worked on Dimension as well as Fact tables, developed mappings and loaded data on to the relational database.
• Worked extensively on different types of transformations like Source Qualifier, Expression, Filter, Aggregator, Update Strategy, Lookup, Sequence Generator, Joiner and Stored Procedure.
• Analyzed the session, event and error logs for troubleshooting mappings and sessions.
• Provided support for the applications after production deployment to take care of any post-deployment issues.
Environment: Informatica 8.6.1, UNIX shell scripting, Oracle 10g, SQL programming, MS Excel, SQL*Plus.
Education
Computer & Information Science
Sacred Heart University 2015
Information Technology
JNTU 2010