Taimur
taimur0613@gmail.com
732-788-6396
New Brunswick, NJ 08989
Tech Lead/Lead Developer
16 years experience W2
Summary

  • 12+ years of work experience in ETL development, data visualization and data warehousing using cutting-edge technologies.
  • 3+ years of data visualization experience using Tableau, QlikView and Power BI.
  • 8+ years of strong experience in Data Warehousing and ETL using Informatica PowerCenter 10.1/9.1/9.0.1/8.x/7.x/6.1, Power Exchange 9.1/8.6/8.1, Oracle 11g/10g/9i, Teradata 13/12/V2R6 and Erwin.
  • 2 years of experience in IBM Netezza data warehouse development using Netezza Database 7.2.1.
  • 3 years of experience in real-time data warehouse development using the CDC tool Informatica PowerExchange 9.1/8.6/8.1.
  • Experience in Data Warehouse/Data Mart Development Life Cycle using Dimensional modeling of STAR, SNOWFLAKE schema, OLAP, ROLAP, MOLAP, Fact and Dimension tables, and Logical & Physical data modeling using ERWIN 7.5/4.2 and MS Visio.
  • Business Intelligence experience using OBIEE 11g/10g, Business Objects XI R2 and MS Access Reports.
  • Extensive experience in using Oracle 11g/10g/9i, DB2 8/7, MS SQL Server 2008/2000, Teradata 13/12/V2R6, MS Access 7.0/2000, Erwin, XML, SQL, PL/SQL, SQL*Plus, SQL*Loader and MS SQL Developer 2000, Win 7/XP and Sun Solaris.
  • Worked with Teradata loading utilities like Multi Load, Fast Load, TPump and BTEQ.
  • Extensively worked on Oracle functions, cursors, stored procedures, packages and triggers.
  • Experience in data modeling, creating LDM and PDM for Star and Snowflake schemas using MS Visio and ERWIN 7.1/4.5.
  • Exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation and production support.
  • Excellent working knowledge of UNIX shell scripting, job scheduling on multiple platforms, experience with UNIX command line and LINUX.
  • Experience in preparing ETL design documentations, user test case documentation and standard operating procedures (SOP) documentation.
  • Experience working in an onsite–offshore model.
  • Worked as Agile Scrum call facilitator, tracking status and discussion using BaseCamp and Rally.
  • Extensive experience in effort estimation, distribution of work, tracking and updating status reports, team coordination and client updates.
  • Highly motivated to take independent responsibility, with the ability to contribute as a productive team member.
  • Provided production support, including resolution and closure of trouble tickets.
  • Experienced in using the IDQ tool for profiling, applying rules and developing mappings to move data from source to target systems.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0 and 9.5.1.

Experience
Tech Lead/Lead Developer
Information Technology
May 2018 - present
San Francisco, CA

  • Developed and advocated development standards and practices for the development team, i.e. coding, code management, and documentation.
  • Developed new workflow components for the Salesforce system.
  • Designed and developed highly efficient, high-performance ETL mappings and workflows.
  • Designed, developed and tested ETL mappings, mapplets, workflows and worklets using Informatica PowerCenter 10.1.
  • Developed Informatica code; designed, developed and modified Informatica mappings and workflows.
  • Involved with management in terms of supplying input for key design and architecture decisions, as well as work estimation and resource planning.
  • Applied policies, procedures and standards relating to BI ETL process development, in conjunction with Data Warehouse best practices.
  • Managed the Salesforce.com CRM application, including ongoing support requests and administrative needs.
  • Hands on configuration of Service Cloud/Salesforce Platform including users, roles, security, profiles, workflow rules, custom objects, etc.
  • Developed reports, dashboards, and processes and monitor data quality and integrity. Execute data migration/cleansing projects.
  • Worked with various functions and end users to identify, document, and communicate standard business processes as they relate to Salesforce.
  • Proactively identified and implemented operational improvements, enhancements, and system customizations that meet business requirements.
  • Utilized SFDC to improve processes and productivity and make recommendations to support an organization scaling at a rapid pace.
  • Performed configuration and customization of the Salesforce.com platform.
  • Participated in efforts to develop and execute testing, training and documentation.
  • Utilized best practices to perform operational support, enhancements, and bug fixes as needed to the Salesforce.com platform.
  • Provided technical assistance and end user troubleshooting for bug fixes, enhancements, and “how-to” assistance.
  • Managed and supported the Salesforce offshore team.

Environment: Informatica Power Center 10.1, Oracle 11g, NoSQL, NZSQL, UNIX Shell Scripting, SQL, PL/SQL, TOAD, MS Access, MS Visio, Utilities BTEQ, FLOAD and TPUMP, Putty, WinScp, WinCvs, Linux, Tableau, Ultra Edit

MongoDB Oracle SQL TOAD UNIX
Data Engineer/Lead Developer     
Information Technology
Jul 2017 - Apr 2018
San Francisco, CA

  • Developed ETL programs using Informatica to implement the business requirements.
  • Performed data mapping of source-to-target data sets.
  • Loaded the aggregate data into a relational database for reporting, dash boarding and ad-hoc analysis.
  • Oversaw the inbound and outbound interface development process, working closely with functional teams, developers and testers.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Used Type 1 SCD and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Worked with data scientists to create tables in Hive to run data models.
  • Converted the EDW to a Big Data platform.
  • Implemented Pass Through, Auto Hash, User defined Hash Key and Data Base Partitions for performance tuning.
  • Analyzed Bulk Load option, third party Loaders suggested by Informatica.
  • Integrated Hadoop into traditional ETL, accelerating the extraction, transformation, and loading of massive structured and unstructured data.
  • Extracted data from legacy systems into staging area using ETL jobs & SQL queries.
  • Developed the SQL scripts and Procedures for the business rules using Unix Shell and NZSQL for Netezza.
  • Assessed the Netezza environment for implementation of the ELT solutions.
  • Performed structural harmonization by extracting data from multiple staging tables and integrating it into an alignment-schema table.
  • Performed Requirement Analysis, Designing and Creating Data Services Layer in DENODO Express and feeding it to downstream Dashboard systems.
  • Created custom DENODO views by joining tables from multiple data sources.
  • Designed and developed high-quality integration solutions by using DENODO virtualization tool (read data from multiple sources including Oracle, Hadoop, MySQL)
  • Designed and deployed rich graphical visualizations with drill-down and drop-down menu options and parameters using Tableau.
  • Strong dashboard design experience and a passionate practitioner of effective data visualization, familiar with visualization and design best practices.
  • Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
  • Experience in Agile Methodology project execution model and expertise in coordinating the teams between multi locations.
  • Created Jira tickets and checked code into SVN branches using TortoiseSVN.
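The Type 2 SCD loads above were built as Informatica mappings; as an illustrative sketch only, the expire-and-insert logic looks roughly like this in Python (the record layout and field names such as `eff_from`/`eff_to`/`current` are hypothetical, not the actual mapping):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Sketch of a Type 2 SCD merge: when a tracked attribute changes,
    expire the current dimension row and insert a new version."""
    today = today or date.today().isoformat()
    by_key = {r["key"]: r for r in dimension if r["current"]}
    out = list(dimension)
    for row in incoming:
        cur = by_key.get(row["key"])
        if cur is None:
            # brand-new business key: insert first version
            out.append({**row, "eff_from": today, "eff_to": None, "current": True})
        elif cur["attrs"] != row["attrs"]:
            cur["eff_to"] = today      # close out the old version
            cur["current"] = False
            out.append({**row, "eff_from": today, "eff_to": None, "current": True})
    return out
```

A Type 1 load would instead overwrite `cur["attrs"]` in place, keeping a single row per key.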

Environment: Hadoop, Python, Spark, Hive, Informatica Power Center 10.2/10.1, Oracle 11g, Netezza, NoSQL, NZSQL, DENODO, UNIX Shell Scripting, SQL, PL/SQL, TOAD, MS Access, MS Visio, Utilities BTEQ, FLOAD and TPUMP, Putty, WinScp, WinCvs, Linux, Tableau, QlikView, Tortoise, Ultra Edit

Big Data Data Mapping Data Visualization Data Warehousing ETL Hadoop Hive Informatica Informatica Powercenter Oracle PL/SQL QlikView Spark SQL Tableau TOAD
Data Engineer/Lead Developer
Information Technology
Nov 2014 - Jun 2017
Jersey City, NJ

  • Involved and proficient in defining and validating protocols for clinical studies and handling trial responsibility throughout the data-management lifecycle.
  • Worked on CED-EDC source system (Electronic Data Capture) and database design and hypothesis.
  • Support clinical trials for CRO (Contract Research Organizations) by providing meticulous data management. Design and maintain databases, queries, reports, and graphics and data-analysis tools; perform data entry, check reviews, database audits and coding; and define and validate study protocols.
  • Worked on Oracle Clinical development on the design, testing and implementation of study databases.
  • Develop clear clinical data sets enabling the standardized collection and analysis of massive amounts of cross-boundary data content in a timely manner and with a high level of accuracy.
  • Tracked progress of clinical studies, ensuring projects meet timelines and quality expectations.
  • Oversee data-management lifecycle of large clinical trials, composing and verifying reports and results.
  • Strong exposure and Involved in writing Simple and Complex SQLs, PL/SQL Functions and Procedures, Packages and creation of Oracle Objects - Tables, Materialized views, Triggers, Synonyms, User Defined Data Types, Nested Tables and Collections.
  • Interacted with business users to collect and analyze requirements, then designed and recommended solutions.
  • Worked extensively with SQL, PL/SQL, the Oracle database and other Oracle facilities such as Import/Export, SQL*Loader and SQL*Plus.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Prepared BRS (Business Requirement Specifications) document that gives the detailed information about the requirements.
  • Good understanding of database objects and ability to triage issues.
  • Involved in PL/SQL code review and modification for the development of new requirements.
  • Created materialized views required for the application.
  • Involved in handling the changes in compiling scripts according to the database changes.
  • Developed stored procedures to extract the data from different sources and load it into data warehouse.
  • Analyzed data and mapped data requirements, developing stored procedures, functions and triggers.
  • Involved in uploading of the data from flat files into Databases and validated the data with PL/SQL procedures.
  • Maintained the daily batch cycle and provided 24-hour production support.
  • Preparation of the Test Cases and involvement in Unit Testing and System Integration Testing.
  • Utilized SQL*Loader to load flat files into database tables.
  • Created SQL*Loader scripts to load data into temporary staging tables.
  • Worked on ETL process of data loading from different sources and data validation process from staging area to Actavis data warehouse.
  • Worked with ETL team involved in loading data to staging area to data warehouse. Provided all business rules for the database for loading data.
  • Proficient in ETL (Extract, Transform, Load) using SQL Server Integration Services 2012 (SSIS) and the Informatica PowerCenter tool.
  • Reviewed the Data-stage mapping to check the proper implementation of Business Rules, Load Testing for final deployment in production.
  • Responsible for system design concerning data integration and preparation of Technical Design Document (TDD).
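The SQL*Loader staging loads and PL/SQL validation above follow a load-with-rejects pattern; a minimal sketch of that pattern, using an in-memory SQLite table as a stand-in for the Oracle staging table (the table and column names are hypothetical):

```python
import csv
import io
import sqlite3

def load_staging(conn, csv_text):
    """Sketch: load a delimited extract into a staging table, routing
    rows that fail basic validation to a reject list (illustrative
    stand-in for SQL*Loader's bad-file behavior)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_subject (subject_id TEXT, visit_date TEXT)"
    )
    good, rejects = 0, []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # basic checks: key present, date in YYYY-MM-DD shape
        if not row["subject_id"] or len(row["visit_date"]) != 10:
            rejects.append(row)   # bad row -> reject file, like a .bad file
            continue
        conn.execute(
            "INSERT INTO stg_subject VALUES (?, ?)",
            (row["subject_id"], row["visit_date"]),
        )
        good += 1
    conn.commit()
    return good, rejects
```

Rejected rows would then be corrected and re-submitted, mirroring the validation procedures described above.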

Environment: Informatica Power Center 10.1/9.5/9.1, Oracle 11g, UNIX Shell Scripting, SQL, PL/SQL, TOAD, MS Access, MS Visio, Utilities BTEQ, FLOAD and TPUMP, Putty, WinScp, WinCvs, Linux, QlikView, SCM, Rally, UltraEdit, BizTalk, Dell Boomi.

Data Cleansing Data Integration Data Mapping Data Validation Data Warehousing ETL Hbase Informatica Informatica Powercenter Java MapReduce MongoDB Oracle PL/SQL QlikView SQL SQL Server Stored Procedure TOAD Triggers UNIX
Sr. Informatica Developer
Information Technology
Apr 2014 - Oct 2014
Marlborough, MA
  • Interacted with the client on a regular basis to discuss day-to-day issues and matters.
  • Provided support for Informatica workflows/mappings (developed using Informatica 9.1.0) running in the production environment at the client location.
  • Conducted training and Knowledge Transfer (KT) sessions for onsite and offshore developers and testers on domain and functional areas.
  • Used clinical trials data and compound data to launch blockbuster drugs more quickly, frequently and cost-effectively.
  • Maintained persistent data quality to ensure the integrity of ongoing clinical and compound data.
  • Integrated clinical trials data and compound data from operational and analytical applications into enterprise data integration.
  • Configured a flexible data model for all clinical trials data.
  • Ensuring top-quality deliverables from HCL to the client.
  • Provide support for code developed using Data Warehouse Administration Console (DAC) in order to achieve scheduling for ETL jobs.
  • Involved in data analysis and handling the ad-hoc request by interacting with business analyst, client and customers and resolve the issues as part of production support.
  • Reviewing project/task, status/issues with the HCL offshore team and ensuring completion of project on time.
  • Developing UNIX shell script for automation and enhancing/streamlining existing manual procedure used at client location.
  • Involved in the development of Informatica mappings and preparation of design documents (DD), technical design documents (TDD) and user acceptance testing (UAT) documents.
  • Actively participated in providing technical proposals for upgrading existing ETL and OBIEE code at client locations (to make use of advanced features of newer Informatica versions).
  • Made use of various HCL proprietary frameworks and techniques for requirements gathering and business process maps to understand the current process.
  • Experienced in using IDQ tool for profiling, applying rules and develop mappings to move data from source to target systems.
  • Created test plans, test data for extraction and transformation processes and resolved data issues following the data standards.
  • Strong Data analysis to ensure accuracy and integrity of data in the context of Business functionality.
  • Worked on Informatica developer tool IDQ of latest versions 9.1.0 and 9.5.1.
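The profiling step an IDQ profile performs can be illustrated with a small Python sketch; the statistics shown (null count, distinct count, value lengths per column) are an assumption about a typical profile, not IDQ's actual output format:

```python
def profile(rows, columns):
    """Sketch of column profiling: null count, distinct count,
    and min/max value length for each requested column."""
    stats = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        nonnull = [v for v in values if v not in (None, "")]
        lengths = [len(str(v)) for v in nonnull]
        stats[col] = {
            "nulls": len(values) - len(nonnull),
            "distinct": len(set(nonnull)),
            "min_len": min(lengths, default=0),
            "max_len": max(lengths, default=0),
        }
    return stats
```

Profile output like this is what drives the rules later applied in the source-to-target mappings.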

Environment: Informatica Power Center 9.5/9.1, Informatica Power Exchange 9.1/8.6, IDQ 9.5.1, Oracle 11g, UNIX Shell Scripting, SQL, PL/SQL, TOAD, MS Access, MS Visio, Tidal, Utilities BTEQ, FLOAD and TPUMP, Putty, WinScp, WinCvs, Linux, BaseCamp, SCM, Rally, SmartCapa, UltraEdit.

Data Analysis Data Integration Data Warehousing ETL IDQ Informatica Informatica Developer Informatica Powercenter OBIEE Oracle PL/SQL Ruby on Rails SQL TOAD UNIX
Sr. Informatica Developer
Information Technology
Aug 2013 - Mar 2014
Marlborough, MA

  • Developed technical specifications of the ETL process flow.
  • Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in SQLServer and Oracle.
  • Worked on various issues in existing Informatica mappings to produce correct output.
  • Debugged mappings by creating logic that assigns a severity level to each error, and sending the error rows to error table so that they can be corrected and re-loaded into a target system.
  • Analyzed the existing system and developed business documentation (TRD) on required changes.
  • Analyzed and reverse-engineered existing mappings and created the DLD.
  • Analyzed existing Health Plan issues and redesigned as changes required.
  • Involved in the Unit Testing, Event & Thread Testing and System testing.
  • Involved in gathering of business scope and technical requirements and created technical specifications.
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
  • Performed unit and integration testing in User Acceptance Test (UAT), Operational Acceptance Test (OAT), Production Support Environment (PSE) and Production environments.
  • Created HLD, LLD, UTC doc and Migration documents.
  • Designed and developed Workflows as per ETL Specification for Stage load and Warehouse load.
  • Worked with production support systems that required immediate support.
  • Monitor and tune ETL processes for performance improvements; identify, research, and resolve data warehouse load issues.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation type for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions.
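The error-handling logic described above (assign a severity level to each failed rule, send failing rows to an error table for correction and re-load) can be sketched in Python; the check names and severity values below are hypothetical:

```python
def route_errors(rows, checks):
    """Sketch of row-level error routing: each check is (predicate,
    severity); failing rows go to an error table tagged with the
    worst severity, clean rows continue to the target."""
    target, error_table = [], []
    for row in rows:
        failures = [(name, sev) for name, (test, sev) in checks.items()
                    if not test(row)]
        if failures:
            error_table.append({
                **row,
                "severity": max(sev for _, sev in failures),
                "failed_checks": [name for name, _ in failures],
            })
        else:
            target.append(row)
    return target, error_table
```

Rows landing in the error table can be corrected and re-run through the same checks before re-loading into the target.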

Environment: Informatica Power Center 9.5/9.1, Informatica Power Exchange 9.1/8.6, Oracle 11g, UNIX Shell Scripting, SQL, PL/SQL, TOAD, MS Access, MS Visio, Tidal, Utilities BTEQ, FLOAD and TPUMP, Putty, WinScp, WinCvs, Linux, BaseCamp, SCM, Rally. BizTalk, Dell Boomi, Cognos Framework Manager.

Cognos Data Warehousing ETL Informatica Informatica Developer Informatica Powercenter Oracle PL/SQL SQL Stored Procedure TOAD UNIX
Sr. Informatica Developer
Information Technology
Oct 2011 - Jul 2013
New York City, NY

  • As a lead member of ETL Team, responsible for analyzing, designing and developing ETL strategies and processes, writing ETL specifications for developer, ETL and Informatica development, administration and mentoring.
  • Participated in business analysis, ETL requirements gathering, physical and logical data modeling and documentation.
  • As Scrum Master, managed stand-ups, backlogs and sprint planning meetings.
  • Delivered quality solutions on time and within budget using approved scheduling tools, techniques and methodologies; worked well with internal and external stakeholders at multiple levels, navigating the organization and completing complex projects under tight deadlines.
  • Performed self and peer reviews of Informatica and Oracle objects.
  • Designing the data transformation mappings and data quality verification programs using Informatica and PL/SQL.
  • Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.
  • Designed Reusable Transformations, Mapplets, and reusable Tasks and designed Worklets as per the dependencies of various sessions and parent-child tables.
  • Worked on performance tuning of Informatica code, extensively customizing caches, partitioning, pushdown optimization and transformation tuning.
  • Created PowerExchange registration and configured in PowerCenter to load data in real-time mode.
  • Worked on Teradata utilities BTEQ, MLOAD and TPUMP and tuned SQL.
  • Created Oracle stored procedure, package and triggers. Worked on analytical query to format report. Created materialized view to store summarized data.
  • Investigate, debug and fix problems with Informatica Mappings and Workflows.
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
  • Performing ETL, Unix script and database code migrations across environments.
  • Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables. Partitioned sessions and used incremental aggregation for fact load.
  • Participated in Decision support team to analyze the user requirements and to translate them to technical team for new and change requests.
  • Performed unit and integration testing in User Acceptance Test (UAT), Operational Acceptance Test (OAT), Production Support Environment (PSE) and Production environments.
  • Created HLD, LLD, UTC doc and Migration documents.
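Incremental aggregation, as used for the fact load above, folds only newly arrived rows into the stored aggregate instead of re-reading history. A minimal Python sketch (the key and measure names are hypothetical):

```python
def incremental_aggregate(agg, new_rows, key="product", measure="amount"):
    """Sketch of incremental aggregation: merge only the new batch
    into the existing aggregate rather than rescanning all history."""
    out = dict(agg)          # leave the stored aggregate untouched
    for row in new_rows:
        out[row[key]] = out.get(row[key], 0) + row[measure]
    return out
```

In PowerCenter the equivalent is a session with incremental aggregation enabled, where the aggregator cache plays the role of `agg` here.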

Environment: Informatica Power Center 9.1/8.6, Informatica Power Exchange 9.1/8.6, Oracle 11g, UNIX Shell Scripting, SQL, PL/SQL, TOAD, MS Access, MS Visio, Autosys, Teradata 13, Utilities BTEQ, FLOAD and TPUMP, Putty, WinScp, DB2 Mainframe, Linux, BaseCamp, SCM, Rally.

Data Modeling ETL Informatica Informatica Developer Informatica Powercenter Oracle PL/SQL Scrum SQL SQL Server Stored Procedure Teradata TOAD Triggers UNIX XML
Informatica Developer  
Information Technology
Jan 2011 - Sep 2011

  • Designed and developed Logical/physical Data Model, Forward/Reverse engineering Using Erwin 7.2.
  • Designed and developed Workflows as per ETL Specification for Stage load and Warehouse load.
  • Provided on-call production support, efficiently tracked heat tickets, resolved production issues in a timely manner, and proactively escalated (when appropriate), resolving and closing trouble tickets.
  • Designed ETL functional specifications and converting them into technical specifications.
  • Interacted with management to identify dimensions and measures.
  • Review source systems and propose data acquisition strategy.
  • Developed ETL methodology to custom fit the ETL needs of sales.
  • Data collection and transformation mappings and design of the data warehouse data model.
  • Responsible for developing mappings for the pre-existing procedures in the CDW, as the procedures were being removed from the cloned CDW.
  • Responsible for data analysis of the target systems that depend on CDW data.
  • Delivered the test plan, test specification and test report document as per the Guidant system document management system.
  • Coordinated with team members for collecting information about system testing of the clone CDW.
  • Created Web services source and targets. Customized web services mappings and workflows. Extensively worked on XML source, targets and transformations.
  • Actively involved in coordinating all the testing related issues with the end users and the testing team.
  • Executed Test Cases to ensure the product meets the specifications and the Life Cycle Services.
  • Worked on Data Analysis reports Business Objects.
  • Developed shell scripts for job automation, which will generate the log file for every job.
  • Prepared Run books, migration documents and production monitoring and support handbook for daily, weekly and monthly processing.
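The job-automation pattern above (every job run writes its own log file) was implemented in shell scripts; as an illustrative sketch only, the same wrapper looks like this in Python (the job names and wrapper function are assumptions):

```python
import logging
import pathlib

def run_job(name, job, log_dir):
    """Sketch of a job wrapper: each run gets its own log file
    recording start, successful finish, or failure."""
    log_path = pathlib.Path(log_dir) / f"{name}.log"
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(log_path)
    logger.addHandler(handler)
    try:
        logger.info("job %s started", name)
        job()                 # the actual workload (e.g. a workflow kick-off)
        logger.info("job %s finished OK", name)
        return True
    except Exception as exc:
        logger.error("job %s failed: %s", name, exc)
        return False
    finally:
        handler.close()
        logger.removeHandler(handler)
```

The per-job log files are what the daily/weekly monitoring handbook would point operators at first.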

Environment: Informatica PowerCenter 8.6, Informatica PowerConnect, Informatica PowerExchange 8.6, Oracle 11g, SQL * Loader, Data Pump, TOAD, Business Objects XI/R2, MS SQL Server 2005, Sun Solaris, UNIX, Windows NT 4.0, MQ series

Data Analysis Data Warehousing Erwin Data Modler ETL IBM Websphere MQ Informatica Informatica Developer Informatica Powercenter Oracle SQL SQL Server TOAD UNIX WebServices XML
ETL Developer
Information Technology
Oct 2009 - Dec 2010
Santa Clara, CA

  • Involved in Requirement gathering and studying of Source-Target mapping document
  • Involved in Data transfer from OLTP systems forming the extracted sources.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Analyzed the sources, transformed the data, mapped the data and loaded it into targets using PowerCenter Designer.
  • Designed and developed Oracle PL/SQL Procedures.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Participated in the design of Star & Snowflake schema data model.
  • Tested Informatica workflows using various transformations for extracting data from flat files, oracle and loading aggregated data into staging and target data mart.
  • Involved in creation of Test Cases, Test Scripts and logging of defects using Test Director.
  • Developed the materialized views, organized indexes to improve the performance of SQL queries.
  • Tested Informatica mappings and also tuned them for better performance and implemented various Performance and tuning techniques.
  • Responsible to tune ETL procedures and Star Schemas to optimize load and query performance.
  • Written stored procedures and triggers.
  • Performed System testing, Regression testing, functional testing manually.
  • Created Test plans that describe each test in minute detail.
  • Developed defect tracking report using Test Director.
  • Created MQ series Source and Target and used XML parser and XML generator.

Environment: Informatica Power Center 8.6, Informatica PowerExchange 8.6, Oracle 11g, XML Files, SQL*PLUS, SQL*Loader, TOAD, Windows 2000, UNIX, Import/Export Utilities, MQ Series, Shell Scripts.

Data Cleansing Data Conversion ETL ETL Developer IBM Websphere MQ Informatica Informatica Powercenter Oracle PL/SQL SQL Stored Procedure TOAD Triggers UNIX XML
ETL Developer
Information Technology
Nov 2007 - Sep 2009
Saint Louis, MO

  • Involved in creation of Logical Data Model for ETL mapping and the process flow diagrams.
  • Worked with SQL developer to write the SQL code for data manipulation.
  • Worked on Informatica versioned repository with check in and checkout objects feature.
  • Used Debugger extensively to validate the mappings and gain troubleshooting information about data and error conditions.
  • Provided guidance to less experienced personnel. Conducted quality assurance activities such as peer reviews.
  • Participate in the business analysis process and the development of ETL requirements specifications.
  • Worked with production support systems that required immediate support.
  • Develop, execute and maintain appropriate ETL development best practices and procedures.
  • Assisted in the development of test plans for assigned projects.
  • Monitor and tune ETL processes for performance improvements; identify, research, and resolve data warehouse load issues.
  • Involved in unit testing of the mapping and SQL code.
  • Developed mappings to load data in slowly changing dimensions.
  • Involved in performance tuning of source & target, mappings, sessions and workflows.
  • Worked on various Teradata utilities such as BTEQ and FLOAD, and created procedures.
  • Worked with connected, unconnected lookups and reusable transformations and mapplets.
  • Utilized Unix Shell Scripts for adding the header to the flat file targets.
  • Involved in designing the star schema and populating the fact table and associated dimension tables.
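The post-load step of adding a header record to the flat-file targets was done with UNIX shell scripts; a rough Python sketch of the same step (file layout and delimiter are assumptions):

```python
def add_header(path, header):
    """Sketch: prepend a header record to a headerless flat-file
    target, the step the shell scripts performed after each load."""
    with open(path, "r") as f:
        body = f.read()
    with open(path, "w") as f:
        f.write(header.rstrip("\n") + "\n" + body)
```

In shell this is typically a `cat header body > tmp && mv tmp target` sequence; the Python version above does the equivalent in place.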

Environment: Oracle 11g, SQL Developer, SQL, Informatica Power Center 8.1, Sybase, Windows XP, Visio 2000, Business objects XIR2, ESP, SCM, Putty, WinScm, Teradata V2R5.

Data Warehousing ETL ETL Developer Informatica Informatica Powercenter Oracle SQL Teradata UNIX
ETL Developer
Information Technology
Jun 2005 - Oct 2007
Minneapolis, MN

  • Developed complex mappings to extract source data from heterogeneous databases (SQL Server, Oracle) and flat files, applied proper transformation rules and loaded it into the data warehouse.
  • Involved in identifying bugs in existing mappings by analyzing data flow, evaluating transformations using Debugger.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Worked closely with Production Control team to schedule shell scripts, Informatica workflows and PL/SQL code in Autosys.
  • Conducted Database testing to check Constrains, field size, Indexes, Stored Procedures etc.
  • Defects were tracked, reviewed and analyzed.
  • Conducted UAT (User Acceptance Testing) with user community.
  • Developed K-shell scripts to run from Informatica pre-session, post session commands.
  • Extracted data from VSAM file and XML files.
  • Involved in Data transfer from OLTP systems forming the extracted sources.
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
  • Analyzed the sources, transformed the data, mapped the data and loaded it into targets using PowerCenter Designer.
  • Designed and developed Oracle PL/SQL Procedures.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Participated in the design of Star & Snowflake schema data model.

Environment: Informatica PowerCenter 7.1/6.1, Oracle 10g, SQL Server 2000, Erwin 3.5, XML, TOAD, HP–Unix 11.11, Harvest, Sun Solaris, DB2 Mainframe.

Data Cleansing Data Conversion Data Warehousing Erwin Data Modler ETL ETL Developer Informatica Informatica Powercenter Oracle PL/SQL SQL SQL Server Stored Procedure TOAD XML
Education
Bachelor's in Information System Management
City University of New York 2005
Skills
Oracle: 13 years (last used 2021)
SQL: 13 years (last used 2021)
ETL: 12 years (last used 2018)
Informatica: 12 years (last used 2018)
Informatica PowerCenter: 12 years (last used 2018)
TOAD: 11 years (last used 2021)
Data Warehousing: 9 years (last used 2018)
PL/SQL: 9 years (last used 2018)
UNIX: 9 years (last used 2021)
Stored Procedure: 8 years (last used 2017)
SQL Server: 7 years (last used 2017)
Data Cleansing: 6 years (last used 2017)
ETL Developer: 5 years (last used 2010)
Triggers: 5 years (last used 2017)
XML: 5 years (last used 2013)
Data Conversion: 3 years (last used 2010)
Data Integration: 3 years (last used 2017)
Data Mapping: 3 years (last used 2018)
Informatica Developer: 3 years (last used 2014)
MongoDB: 3 years (last used 2021)
QlikView: 3 years (last used 2018)
Teradata: 3 years (last used 2013)
Data Validation: 2 years (last used 2017)
Erwin Data Modeler: 2 years (last used 2011)
HBase: 2 years (last used 2017)
Java: 2 years (last used 2017)
MapReduce: 2 years (last used 2017)
Data Analysis: 1 year (last used 2014)
Data Modeling: 1 year (last used 2013)
IBM WebSphere MQ: 1 year (last used 2011)
Scrum: 1 year (last used 2013)
Agile Methodology: 1 year
Big Data: 1 year (last used 2018)
Cognos: 1 year (last used 2014)
Data Visualization: 1 year (last used 2018)
Hadoop: 1 year (last used 2018)
Hive: 1 year (last used 2018)
IDQ: 1 year (last used 2014)
OBIEE: 1 year (last used 2014)
Ruby on Rails: 1 year (last used 2014)
Spark: 1 year (last used 2018)
Tableau: 1 year (last used 2018)
WebServices: 1 year (last used 2011)