Amrendra
akumarbi@yahoo.com
720-257-9643
Littleton, CO 80120
Big Data Engineer
16 years of experience (W2)
Summary

15+ years of experience in the SDLC, with key emphasis on current Big Data technologies:

  • Spark, Scala, Hadoop, Hive, Netezza, Snowflake, Pentaho DI, Tableau, Python, Pandas, and various RDBMS systems.
  • Built and delivered multiple enterprise-level systems. Experienced in architecting, designing, and developing a Big Data solutions practice, including setting up the Big Data roadmap and building the supporting infrastructure and team.
  • Accomplished in cross-functional collaboration and leading through influence with stakeholders, including product, marketing, finance, sales, external customers, and joint development partners.
  • Created and executed customer-focused, value-chain-optimized roadmaps end to end, from ideation to deployment.

Experience
Big Data Consultant
Information Technology
Jun 2019 - present

Comcast designed the roadmap to decommission its Netezza Customer Analytics warehouse because of vendor support issues and hardware aging. The goal of this project is to create a hybrid warehouse solution using Teradata, Hadoop (Hive), and Snowflake; query-fabric (Presto) views bring Teradata and Hive onto one platform. The application reads data from different streaming sources such as Kibana, Kafka, HDFS, RDBMS, and S3. The project is estimated at more than 800 transformations and jobs, with 400+ objects to be created across all three warehouses, and is expected to end in Dec 2021.
Tools & Technologies: Spark/Scala, NiFi, shell scripting, Python, Pandas, Hive, Beeline, Sqoop, BTEQ, FastLoad, MultiLoad, nzload.
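For illustration only, a minimal Spark Structured Streaming sketch in Scala of the kind of Kafka-to-Hive ingestion described above (assumes Spark 3.1+ with the spark-sql-kafka connector on the classpath); the broker address, topic, table name, and checkpoint path are hypothetical placeholders, not values from this project:

    import org.apache.spark.sql.SparkSession

    object KafkaToHiveSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-to-hive-sketch")
          .enableHiveSupport() // resolve tables through the Hive metastore
          .getOrCreate()

        // Subscribe to a Kafka topic as a streaming source (placeholder broker/topic).
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker-1:9092")
          .option("subscribe", "customer_events")
          .load()

        // Kafka delivers key/value as binary; cast the payload to string.
        val events = raw.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

        // Append each micro-batch to a metastore table (placeholder names).
        val query = events.writeStream
          .outputMode("append")
          .option("checkpointLocation", "/tmp/checkpoints/customer_events")
          .toTable("analytics.customer_events_raw")

        query.awaitTermination()
      }
    }

Batch sources such as RDBMS extracts would typically flow through Sqoop or the Teradata/Netezza loaders listed above rather than this streaming path.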

Big Data, Hadoop, HDFS, Hive, Spark, Sqoop, Teradata, Kafka, Scala
Big Data Consultant
Information Technology
Jan 2017 - May 2019

Comcast's vision is to build a new high-performing system for the marketing and campaign teams to expand their business. Using this system, the business studies customer behavior to grow the business and increase customer value. The project uses the HDFS file system, Hive tables, and Snowflake for data storage; the warehouse receives data streams from different internal and third-party systems. For data ingestion and transformation, the project uses Spark (Scala), Python (Pandas), HiveQL, NiFi, and shell scripting. The total count of jobs/transformations is more than 600, processing around 10+ billion records every day, and the project has created 250+ objects across both warehouses.
Tools & Technologies: Spark/Scala, Python/Pandas scripting, shell scripting, NiFi, HiveQL, Snowflake, HDFS.
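As an illustrative sketch only, the following Scala batch job shows how a Hive aggregate could be pushed to Snowflake through the Snowflake Spark connector; the connection options, table names, and query are hypothetical placeholders, not the project's actual objects:

    import org.apache.spark.sql.SparkSession

    object HiveToSnowflakeSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-to-snowflake-sketch")
          .enableHiveSupport()
          .getOrCreate()

        // Snowflake Spark connector options; every value here is a placeholder.
        val sfOptions = Map(
          "sfURL"       -> "myaccount.snowflakecomputing.com",
          "sfUser"      -> sys.env("SNOWFLAKE_USER"),
          "sfPassword"  -> sys.env("SNOWFLAKE_PASSWORD"),
          "sfDatabase"  -> "ANALYTICS",
          "sfSchema"    -> "MARKETING",
          "sfWarehouse" -> "LOAD_WH"
        )

        // Aggregate a Hive source table (hypothetical) into a daily rollup.
        val daily = spark.sql(
          """SELECT customer_id, to_date(event_ts) AS event_date, COUNT(*) AS events
            |FROM analytics.customer_events_raw
            |GROUP BY customer_id, to_date(event_ts)""".stripMargin)

        // "net.snowflake.spark.snowflake" is the connector's documented source name.
        daily.write
          .format("net.snowflake.spark.snowflake")
          .options(sfOptions)
          .option("dbtable", "DAILY_CUSTOMER_EVENTS")
          .mode("append")
          .save()
      }
    }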

Big Data, HDFS, Hive, Spark, Scala, Hadoop
Big Data/Netezza Consultant (Admin)
Information Technology
Apr 2014 - Dec 2016
Comcast Cable, Denver, CO. The goal of this project is to migrate data from the current enterprise MS SQL warehouse and an old legacy Netezza TwinFin cluster to a new Striper Netezza MPP appliance. The project uses different tools to transform and migrate data from various internal and external systems, with Pentaho DI and SSIS as the ETL tools plus Python scripting and NiFi. Worked with multiple source systems, including Teradata, Oracle, MySQL, flat files, the Hadoop file system, MongoDB, Kibana, curl, and Kafka streaming. The warehouse processes billions of records daily from different systems, which are ingested into the Netezza appliance; total views/tables number 200+. Along with the migration, the project also involves converting Tableau reports to the Netezza warehouse. Also worked as a data modeler, delivering and implementing logical and physical data models using Netezza best practices, and designed the data model using Netezza PureData Analytics best practices. Worked on setting up the appliance, including creation of resource groups, the history database, client configuration, groups and permissions, backup setup, and UDF configuration.
Tools & Technologies: nzload, nz_groom, nz_backup, nz_migrate, Fluid Query, Pentaho DI, NiFi, Python/Pandas scripting, shell scripting.
Big Data, Data Architecture, Database Backups, ETL, Hadoop, MongoDB, MySQL, Netezza, Oracle, Shell Scripts, SQL, SSIS, Tableau, Teradata
Data Architect & ETL Consultant
Apr 2007 - Jun 2008
Asurion. The client decided to streamline the whole system and environment and unify the technology; different teams were using different ETL tools to build the warehouse system. During the first phase of this project, the client planned to migrate all DataStage ETL jobs to Microsoft SSIS; worked on replicating 150+ DataStage packages in SSIS. Another goal was to build a new warehouse using enterprise MS SQL Server and a business intelligence dashboard using BusinessObjects (Crystal Reports).
Tools & Technologies: SSIS, MS SQL Server, Crystal Reports, Erwin, VB scripting, stored procedures.
Business Intelligence, Crystal Reports, Data Architecture, Data Warehousing, Erwin Data Modeler, ETL, Scripting, SQL Server, SSIS, Stored Procedure
ETL Consultant
Feb 2007 - Apr 2007
The goal of this project is to create a functionally rich business intelligence platform that supports all reporting and analytical needs of the Merrill Lynch US (MLUS) servicing business. Work includes data modeling using Erwin, designing ETL jobs using Informatica, writing stored procedures in T-SQL, and Unix scripting.
Tools & Technologies: Informatica, MS SQL Server, Unix scripting, Erwin.
Recognition: Awarded for timely delivery of work under a tight deadline and tough working conditions.
Business Intelligence, Data Architecture, Data Modeling, Data Warehousing, Erwin Data Modeler, ETL, Informatica, SQL Server, Stored Procedure, T-SQL
ETL Consultant
Information Technology
May 2006 - Jan 2007
Aurora Loan Services reporting. Worked on different data marts and built reports using BusinessObjects (Crystal Reports). Worked with the DBAs and the data architect team to design the whole warehouse on MS SQL Server. ETL jobs were designed using SSIS and VB scripts.
Tools & Technologies: SSIS, MS SQL Server, Crystal Reports, Erwin, VB scripting, stored procedures.
Crystal Reports, Data Architecture, Data Marts, Erwin Data Modeler, ETL, SQL Server, SSIS, Stored Procedure
Data Architect and Report Designer
Information Technology
Aug 2005 - Apr 2006
Asurion. Worked on migrating 100+ Excel reports and macros to BusinessObjects Crystal Reports. Worked on data models, query optimization, and stored procedure writing; worked closely with the DBA to identify slow-running queries and optimize performance.
Tools & Technologies: MS SQL Server, DTS, Crystal Reports, VB scripting, stored procedures.
Crystal Reports, Data Architecture, DBA, Microsoft Excel, Scripting, SQL Server, Stored Procedure
Database Developer
Information Technology
Jan 2005 - Jul 2005
Worked on data modeling, query optimization, data mart development, and other database development. Customized existing reports and migrated them to the new system.
Data Marts, Data Modeling
Remove Skill
Education
Information Technology Management
Western Governors University, 2020

IGNOU, 2008
Minor: IT Management
Studies focused on operations and technologies

2003
Studies focused on computer fundamentals and the latest technologies; coursework included Data Structures and MS-SQL Server
Certifications
Netezza PureData Analytics
Microsoft MS 70-270 (Install, Configure, Administer & Troubleshoot)
Snowflake Computing Associate
Skills
Big Data: 5 years (last used 2021)
Data Architecture: 5 years (last used 2016)
ETL: 4 years (last used 2016)
Hadoop: 4 years (last used 2021)
SSIS: 4 years (last used 2016)
Crystal Reports: 3 years (last used 2008)
HDFS: 3 years (last used 2021)
Hive: 3 years (last used 2021)
Scala: 3 years (last used 2021)
Spark: 3 years (last used 2021)
SQL Server: 3 years (last used 2008)
Stored Procedure: 3 years (last used 2008)
Teradata: 3 years (last used 2021)
Database Backups: 2 years (last used 2016)
Erwin Data Modeler: 2 years (last used 2008)
MongoDB: 2 years (last used 2016)
MySQL: 2 years (last used 2016)
Netezza: 2 years (last used 2016)
Oracle: 2 years (last used 2016)
Scripting: 2 years (last used 2008)
Shell Scripts: 2 years (last used 2016)
SQL: 2 years (last used 2016)
Tableau: 2 years (last used 2016)
Business Intelligence: 1 year (last used 2008)
Data Marts: 1 year (last used 2007)
Data Warehousing: 1 year (last used 2008)
DBA: 1 year (last used 2006)
Kafka: 1 year (last used 2021)
Microsoft Excel: 1 year (last used 2006)
Sqoop: 1 year (last used 2021)
Data Engineering: 1 year (last used not listed)
Data Modeling: 1 year (last used 2007)
Informatica: 1 year (last used 2007)
Python: 1 year (last used not listed)
T-SQL: 1 year (last used 2007)