Ravi
ravikanth26782@gmail.com
904-923-7284
Jacksonville, FL 32290
Mainframe/Hadoop Engineer
11 years experience W2
Summary

  • 10+ years of hands-on experience in systems analysis, development, and enhancement using the SDLC methodology on mainframe technologies, including 2+ years with the Big Data/Hadoop ecosystem.
  • 2+ years of experience with Hadoop infrastructure, including Hive, Sqoop, Flume, HBase, HDFS, YARN, and MapReduce.
  • Excellent experience working with mainframe technologies such as COBOL, JCL, DB2, CICS, IMS DB/DC, TSO, VSAM, SPUFI, PDS, Message Broker, and DB2 stored procedures.
  • Experience setting up standards and processes for Hadoop-based application design and implementation.
  • Experience importing and exporting data between HDFS and relational database systems/mainframe using Sqoop.
  • Good knowledge of Spark, Spark-SQL & Apache Kafka.
  • Prepared and maintained the traceability matrix to track the effect of each client-requested change on the initial requirements across the analysis, design, coding, and testing phases.
  • Involved in proof-of-concept projects involving data transfer from mainframe to Hadoop.
  • Expertise in using the Changeman tool for checking out programs and testing components.
  • Expertise in using the Endevor tool for modifying programs and testing them at the different testing levels.
  • Expertise in using Software Configuration Management (SCM) tools for creating and maintaining components.
  • Expertise in using the SDF2 tool to modify CICS BMS online screens.
  • Expertise in using the Toad tool for DB2 queries and running DB2 stored procedures.
  • Expertise in using the Toad tool for Oracle.
  • Very good experience using MQSeries queues to read and write messages.
  • Very good experience in the Transportation & Health Insurance domains.
  • Involved in many projects requiring both Oracle and mainframe changes; excellent experience supporting Oracle teams.
  • Solid planning and organizational skills in all aspects of development.
  • Strong interpersonal and communication skills; able to work in a team as well as independently with minimal supervision.
  • Hands-on experience creating and analyzing test plans, test strategies, and test data.
  • Good knowledge of the NoSQL database MongoDB.
  • Participated in daily Scrum meetings with the team, walkthroughs, and defect-tracking meetings.
  • Provided support for system testing and implementation.
  • Experience participating in meetings to gather information and requirements from clients.
  • Reliable, responsible, hardworking, and a good team player.

Experience
Mainframe/Hadoop Engineer
Transportation
Nov 2014 - present

Team Size: 10. Working as a Mainframe/Hadoop Engineer on the Transportation project for LMS (Locomotive Management System), a LOCOGPS & GIS system that deals with different types of locomotives, pings, and geographical latitude/longitude locations. The client represents North America's freight railroads and Amtrak, and strives to help make the rail industry increasingly safe, efficient, and productive. Involved in enhancing the existing LMS application at client request and in designing a distributed architecture for new data. The LMS system maintains and provides a timely, accurate physical description of the national fleet, covering locomotives, locomotive features, GPS pings from those locomotives, etc. Job Responsibilities:

  • Collaborated with business partners and team members to develop specifications, implement business solutions, and resolve problems.
  • Ran Hive queries in Hadoop and DB2 queries on the mainframe to verify that all data was loaded into Hadoop.
  • Worked on importing and exporting the data using Sqoop from HDFS to Relational Database systems/mainframe and vice-versa.
  • Worked effectively as a team with peers, end-users, and management to ensure process of deliverables at all stages of the project life cycle.
  • Managed task allocation and team activities.
  • Effectively handled the role of onsite coordinator and ensured that a high-quality product was always delivered to the customer.
  • Prepared the error-handling document to maintain the error-handling process.
  • Managed the team in getting incident tickets and problem tickets fixed during the conversion.
  • Imported all the Locomotive specific data to Hadoop using Sqoop component of Hadoop.
  • Analyzed the existing core mainframe application and identified the rules defined in the DB2 database.
  • Planned and coordinated testing across multiple teams, tracked and reported status, created testing cycle plan, scenarios etc.
  • Participated in knowledge sharing sessions with the end customers about the new version.
  • Troubleshoot data issues, validated result sets, recommended and implemented process improvements.
  • Responsible for analyzing the data pipeline that loads data from sources such as IBM mainframes and Oracle using Sqoop, along with the Kafka and Spark frameworks, per the requirements.
  • Supported the team in all phases of the project with business and technical expertise.
  • Supported very complex mainframe and Hadoop applications effectively.
  • Prepared reports and Excel documentation for the deployment of mainframe-to-Hadoop code.
  • Involved in defect-analysis calls for the UAT environment along with users to understand the data and make any modifications suggested by the user.

Environment: Cloudera Hadoop, HDFS, Hive, Sqoop, Flume, Apache Kafka, UNIX, Cloudera Manager, HBase, SQL, IBM Mainframe with z/OS, COBOL, JCL, IMS, DB2, CICS, FOCUS, stored procedures, Changeman, IBM DB2 Data Studio, GTB, Fault Analyzer, JIRA.

Senior Developer/Analyst
Information Technology
Oct 2013 - Oct 2014

Team Size: 6. Worked as a Developer in the Health Insurance domain at Florida Blue, which deals with consumer information details for the health, vision, & dental processes. Florida Blue is one of North America's top health insurance companies. Involved in developing and enhancing the existing CIP (Consumer Information Platform) system at client request and in designing a distributed architecture. CIP handles the enrollment data extract for consumers coming from different sources such as the Federal Employee Program, Florida Health Care Plan, Health Plan Services (HPS), Alliance Highmark, and the Comp Benefits group. Job Responsibilities:

  • Involved in requirements analysis and understanding of business requirements to identify the flow of information, analyzing the existing systems across the daily, weekly, and monthly processes.
  • Designed a distributed architecture for the new data of customers enrolling every day through CIP, the Obamacare plans, etc.
  • Actively involved in gathering business requirements from the clients and preparing the system requirements accordingly.
  • Actively involved in system analysis, design, coding, unit/regression/integration testing, and implementation of various complex modules.
  • Involved in batch program documentation, online program documentation and preparation of the job flow, program flow and screen flow diagrams for the existing system.
  • Involved in the preparation of technical specifications and test plan documents for the development of new programs having complex functionality.
  • Created DB2 setup for test as well as for production environments for CIP system in Enterprise Data Warehouse (EDW) division.
  • Created a new batch program using COBOL DB2 to process the weekly reporting entries present in the Transport Reporting table.
  • Modified the existing batch job using JCL to execute the new COBOL DB2 program.
  • Used new VSAM file to write the error weekly reports from new COBOL DB2 program.
  • Used the Changeman tool to check in and migrate the new batch program and the modified job.
  • Used the IBM Debug Tool for unit testing of the new batch program.
  • Scheduled regular meetings with clients to review design specs, code changes, and test results, and obtained authorization at each stage.
  • Used IBM Personal Communications to analyze the existing system, modify the existing source components, and test them.
  • Used FTP & Sterling Process to receive files from third party vendors for processing consumers.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the changes.
  • Worked closely with mainframe support staff to conduct 'turn-over' at the end of the assignment; turn-over activity is intended to communicate all necessary information required to perform post-implementation production support.

Environment: IBM Mainframe with z/OS, COBOL, JCL, IMS, DB2, TOAD, Endevor, File-Manager, IBM Debug Tool, Control-M tool.

Developer/ Analyst -I
Information Technology
Jan 2011 - Oct 2013
Team Size: 8. Worked as a Developer/Analyst on the Transportation project for CSX, which deals with the transportation of freight and passengers to different locations. CSX is one of North America's top transportation companies. Involved in enhancing the existing Car Accounting system at client request and in designing a distributed architecture. Car Accounting is an accounting system covering the daily, weekly, & monthly processes that deal with claims & reclaims for the system, foreign, & private cars maintained by CSX Transportation. Job Responsibilities:
• Involved in requirements analysis and understanding of business requirements to identify the flow of information, analyzing the existing accounting systems across the daily, weekly, and monthly processes.
• Designed a distributed architecture for new Car Accounting and revenue data, train movement data, etc.
• Actively involved in gathering business requirements from the clients and preparing the system requirements accordingly.
• Actively involved in system analysis, design, coding, unit/regression/integration testing, and implementation of various complex modules.
• Involved in batch program documentation, online program documentation and preparation of the job flow, program flow and screen flow diagrams for the existing system.
• Involved in the preparation of technical specifications and test plan documents for the development of new programs having complex functionality.
• Created DB2 setup for test as well as for production environments for Transportation system in Enterprise Data Warehouse (EDW) division.
• Created a new batch program using COBOL DB2 to process the weekly reporting entries present in the Transport Reporting table.
• Modified the existing batch job using JCL to execute the new COBOL DB2 program.
• Used new VSAM file to write the error weekly reports from new COBOL DB2 program.
• Used the Changeman tool to check in and migrate the new batch program and the modified job.
• Used the IBM Debug Tool for unit testing of the new batch program.
• Scheduled regular meetings with clients to review design specs, code changes, and test results, and obtained authorization at each stage.
• Used IBM Personal Communications to analyze the existing system, modify the existing source components, and test them.
• Used FTP & Sterling Process to send files to third party vendors.
• Created SAS utilities to scan credit card numbers in mainframe SAS datasets and report their names to the authorized business users in case the utility detects credit card numbers.
• Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the changes.
• Worked closely with mainframe support staff to conduct 'turn-over' at the end of the assignment; turn-over activity is intended to communicate all necessary information required to perform post-implementation production support.
• Coordinated with the offshore teams and mentored junior developers.

Environment: IBM Mainframe with z/OS, COBOL, JCL, IMS, DB2, CICS, stored procedures & Message Broker, TOAD, Changeman, File-Manager, IBM Debug Tool.
Junior Developer/Analyst
Information Technology
Mar 2007 - Mar 2008

Team Size: 6. Joined the existing Enterprise Data System (EDS) team as a mainframe developer and successfully designed, developed, and enhanced their Mileage Master Data (MMD) & Early Warning Data (EWD). Suggested key enhancements to the existing data, which resulted in building more efficient transportation strategies. Analyzed, identified, and fixed bad data, and imported data from TransForce for the existing Mileage Master & Early Warning process systems. Mileage Master is a process that calculates miles: when a car travels from origin to destination, the total miles are taken from starting point to ending point, and based on that mileage, payment is made to all the Class I railroads for the corresponding territories the train traveled through. Early Warning (EW) is a process for cars that may require repair or replacement of a part based on a warning raised against the car; once the warning is alerted, the car is sent to the car shop for quality checks on the required parts. Job Responsibilities:
• Attended business meetings with the end clients and vendors to analyze the scope of the migration and the project life cycle end to end. Designed and developed processes to support data quality issues and the detection and resolution of error conditions.
• Worked with the Business Analysts and the QA team for validation and verification of the development.
• Involved in analysis of business requirements provided by the client.
• Involved in High Level and Low Level Design.
• Prepared Unit test plan and unit test data for the newly developed components.
• Performed Unit Testing.
• Documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
• Analyzed the spool information logs, bad files and error tables for troubleshooting.
• Provided effective design solutions for the business requirements; as a senior module developer, reviewed the work done by the team and organized it well to ensure timely delivery and a high-quality product for the customer.
• Involved in preparation of JCL documentation which includes instructions to set up the jobs and execution of jobs in System test environment.
• Prepared Work Effort estimations for on time delivery of the project.
• Followed Mainframe recommendations, methodologies and best practices.
• Worked on Active ShipCSX which is a reporting dashboard at CSX to monitor the daily reports.
• As a Module SME, coordinated smooth flow of communication between onsite and offshore resources, allocated work and monitored the progress.
• Performed volume testing to improve the performance of a job and brought down the run time from 13 hours to 3 minutes and won Client appreciation for the same.
• Prepared knowledge sharing documents for effective transition of the project to service team.
• Ensured timely and defect free delivery of various deliverables of the project.
• Created new Screens using GTB and CICS and created Online Transactions.
• Provided production support and was involved in root cause analysis, bug fixing, and promptly updating the business users on day-to-day production issues.

Environment: IBM Mainframe with z/OS, COBOL, JCL, IMS, DB2, CICS, GTB, Changeman.
Education
not provided
Skills
(skill: year last used, years of experience)
Database Design: 2021, 6
DB2: 2021, 5
CHANGEMAN: 2021, 3
COBOL: 2021, 3
Data Warehousing: 2014, 3
FTP: 2014, 3
IBM Mainframe: 2021, 3
IMS: 2021, 3
JCL: 2021, 3
Production Support: 2014, 3
Project Management: 2014, 3
Sterling: 2014, 3
System Analysis: 2014, 3
System Requirements: 2014, 3
TOAD: 2014, 3
Apache: 2021, 2
CICS: 2021, 2
Cloudera HBase: 2021, 2
Flume: 2021, 2
FOCUS: 2021, 2
Hadoop: 2021, 2
HBase: 2021, 2
HDFS: 2021, 2
Hive: 2021, 2
IBM Data Studio: 2021, 2
JIRA: 2021, 2
Kafka: 2021, 2
Oracle: 2021, 2
SAS: 2013, 2
Spark: 2021, 2
SQL: 2021, 2
Sqoop: 2021, 2
Stored Procedure: 2021, 2
UAT: 2021, 2
UNIX: 2021, 2
z/OS: 2021, 2
Analysis: 2014, 1
BMC Control-M: 2014, 1
Business Requirements: 2014, 1
ENDEVOR: 2014, 1
Integration Testing: 2014, 1
Regression Testing: 2014, 1
Requirement Analysis: 2014, 1
Technical Specifications: 2014, 1
VSAM: 2014, 1
Big Data: year not listed, 1
IBM Websphere MQ: year not listed, 1
IMS DB/DC: year not listed, 1
MapReduce: year not listed, 1
MongoDB: year not listed, 1
Scrum: year not listed, 1
SPUFI: year not listed, 1