Ashish
tyagii.ashish@yahoo.com
714-653-8183
AWS Cloud Engineer
10 years of experience
Experience
AWS Cloud Engineer
Information Technology
Jan 2020 - present
Responsibilities:
  • Migrated applications cross-account, including migration of the code repository (CodeCommit), DynamoDB, SageMaker, Cognito, and S3 artifacts.
  • Deployed and launched the frontend application by integrating it with a CloudFront distribution, and deployed the Alexa skill using Lambda and a Raspberry Pi OS image in the production environment.
  • Designed and deployed IoT things in AWS IoT Core using Cloud9 as the IDE, and integrated SNS with the things, based on their attributes, for real-time updates.
  • Established intercommunication between things using IoT Greengrass for local processing of telemetry data.
  • Processed analytics data and integrated it with the web application for real-time visualization; used SQL queries to validate data and update records for the various modules.
  • Wrote automation scripts in Python for extracting data from JSON and XML files.
  • Created CI/CD pipelines and set up auto-trigger, auto-build, and auto-deployment with CI/CD tools such as Bamboo.
  • Developed ETL jobs for extracting, cleaning, transforming, and loading data into data warehouses.
  • Involved in the design, development, and deployment of web applications in C#.NET and ASP.NET with User Controls, and their deployment in the cloud environment.
  • Resolved update, merge, and password authentication issues in Jenkins and JIRA.
  • Installed, configured, and managed monitoring tools such as Splunk, Nagios, and CloudWatch for resource, network, and log-trace monitoring.
  • Implemented docker-maven-plugin in the Maven pom.xml to build Docker images for all microservices, and later used Dockerfiles to build the images from the Java JAR files.
  • Automated the frontend platform into highly scalable, consistent, repeatable infrastructure using Chef, Jenkins, and CloudFormation.
  • Streamed data in real time using Spark with SQS; responsible for handling streaming data from web server console logs.
  • Wrote Bash and Python scripts, integrating Boto3 to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
  • Implemented a full CI/CD pipeline by integrating SCM (Git) with the build-automation tool Gradle, and deployed using Jenkins and Dockerized containers in production. Also worked with DevOps tools such as Ansible, Chef, AWS CloudFormation, AWS CodePipeline, Terraform, EKS, and Kubernetes.
  • Wrote Terraform modules for automating VPCs and AWS EC2 instances, for creating a VPC and VPN connection from the data center to the production environment, and for cross-account VPC peering.
  • Created EC2 instances and implemented large multi-node Hadoop clusters in the AWS cloud from scratch using automated Terraform scripts.
Skills:
  • Python, IoT Data Analytics, IoT Greengrass, Cloud9, IoT Shadow, NoSQL, DynamoDB, SNS, Lambda, AWS Services, Jira, EKS, Alexa Skills, Cognito, CloudFormation, Route53, CloudFront.
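The Boto3 automation described above (encrypting the EBS snapshots that back an AMI) can be sketched roughly as follows. This is a minimal illustration, not the production script: the function names are invented here, `describe_images` and `copy_snapshot` are real EC2 API calls, and an actual run requires AWS credentials.

```python
def snapshot_copy_params(ebs, source_region, ami_id, kms_key_id=None):
    """Build CopySnapshot kwargs for one block-device mapping.

    Returns None for ephemeral devices and for snapshots that are
    already encrypted, so no copy is made for them.
    """
    if not ebs or ebs.get("Encrypted"):
        return None
    params = {
        "SourceSnapshotId": ebs["SnapshotId"],
        "SourceRegion": source_region,
        "Encrypted": True,
        "Description": f"Encrypted copy of {ebs['SnapshotId']} backing {ami_id}",
    }
    if kms_key_id:
        params["KmsKeyId"] = kms_key_id
    return params


def encrypt_ami_snapshots(ami_id, region="us-east-1", kms_key_id=None):
    """Copy every unencrypted snapshot backing `ami_id` with encryption on."""
    import boto3  # deferred so the pure helper above works without AWS installed

    ec2 = boto3.client("ec2", region_name=region)
    image = ec2.describe_images(ImageIds=[ami_id])["Images"][0]
    copies = {}
    for mapping in image.get("BlockDeviceMappings", []):
        params = snapshot_copy_params(mapping.get("Ebs"), region, ami_id, kms_key_id)
        if params:
            copies[params["SourceSnapshotId"]] = ec2.copy_snapshot(**params)["SnapshotId"]
    return copies
```

A fully encrypted AMI would then be registered from the copied snapshots; that step is omitted here.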
DevOps Engineer
Information Technology
Jul 2018 - Dec 2019
Responsibilities:
  • Built and configured a virtual data center in Google Cloud Platform to support Enterprise Data Warehouse hosting, including a Virtual Private Cloud (VPC), public and private subnets, security groups, route tables, and Google Cloud Load Balancing.
  • Created clusters in Google Cloud and managed them using Kubernetes (k8s); used Jenkins to deploy code to Google Cloud, created new namespaces, and built Docker images and pushed them to the Google Cloud container registry.
  • Configured, monitored, and automated Google Cloud services, and was involved in deploying the content cloud platform using Google Compute Engine and Google storage buckets.
  • Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.
  • Involved in maintaining user accounts (IAM) and the RDS, S3, Route 53, AWS Lambda, VPC, DynamoDB, SES, SQS, SNS, and EMR services in the AWS cloud.
  • Leveraged AWS cloud services such as EMR, Auto Scaling, and VPC (Virtual Private Cloud) to build secure, highly scalable, and flexible systems that handled expected and unexpected load bursts and could quickly evolve during development iterations.
  • Worked on Auto Scaling, CloudWatch (monitoring), AWS Elastic Beanstalk (app deployments), AWS Lambda, Amazon S3 (storage), Amazon EBS (persistent disk storage), and DynamoDB.
  • Built S3 buckets and managed their policies, and used S3 and Glacier for storage and backup on AWS.
  • Configured Kinesis shards for optimal throughput in Kinesis Streams for Spark Streaming applications on AWS.
  • Built CI/CD in the AWS environment using AWS CodeCommit, CodeBuild, CodeDeploy, and CodePipeline, and used AWS CloudFormation, API Gateway, and AWS Lambda to automate and secure the infrastructure on AWS.
  • Created an AWS RDS Aurora DB cluster and connected it to the database through an Amazon RDS Aurora DB Instance using the Amazon RDS Console.
  • Utilized Jenkins master/slave architecture to distribute builds across nodes, triggered Jenkins jobs to build artifacts using Maven, and deployed Terraform templates to create the stack.
  • Implemented a CI/CD pipeline with Docker, EKS, Jenkins, and GitHub, virtualizing the Dev and Test environment servers with Docker and meeting requirements through containerized automation.
  • Implemented AWS CodePipeline, and created CloudFormation JSON templates and Terraform configurations for infrastructure as code.
  • Extensively involved in infrastructure as code, execution plans, resource graph and change automation using Terraform.
  • Operated AWS infrastructure as code using the HashiCorp Terraform tool and Terraform Enterprise (TFE), writing Terraform scripts and modules.
  • Converted existing Terraform modules that had version conflicts to use CloudFormation during Terraform deployments, enabling more control and capabilities that were otherwise missing.
  • Used Python to automate provisioning via Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks, storing the results in DynamoDB.
  • Worked with container-based deployments using Docker, including Docker images, Docker Hub, and Docker registries, and installed, configured, and clustered Kubernetes.
  • Created Docker images using Dockerfiles, worked on Docker container snapshots, removed images, managed Docker volumes, and implemented a Docker automation solution for the CI/CD model.
  • Managed Kubernetes charts using Helm: created reproducible builds of Kubernetes applications, managed Kubernetes manifest files, and managed releases of Helm packages.
  • In-depth knowledge and hands-on experience of Kubernetes, including setting up and managing clusters, creating and managing pods and deployments, application management, rolling updates, exposing applications via services and routes, and persistent volumes and persistent volume claims.
  • Automated configuration management and deployments using Ansible playbooks, used YAML for resource declaration, and created roles and updated playbooks to provision servers with Ansible.
  • Installed, configured, and managed a centralized Ansible server, created playbooks to support various middleware application servers, and configured Ansible Tower as a configuration management tool to automate repetitive tasks.
  • Created AWS infrastructure monitoring through Datadog and application performance monitoring through AppDynamics.
  • Installed and configured the Dynatrace monitoring tool, and created email alerts and threshold values in Dynatrace for our environment.
  • Installed and configured Cloud Foundry Ops Manager, Apps Manager, etc.; configured LDAP for authorization and a log aggregator for logs in PCF (ELK/Splunk).
  • Integrated PagerDuty with AppDynamics, Splunk, Wavefront, CloudWatch, and Slack through APIs, and defined escalation policies and schedules.
  • Extensive programming experience in Python, PowerShell, and Bash; wrote Bash and shell scripts for auto-launch and web server configurations.
  • Designed workflows in Atlassian JIRA to handle issues and maintained all user stories for tracking, following Agile practice.
Skills:
  • Agile, Red Hat, Sun Solaris, Windows, Linux, AIX, SVN, Ant, Maven, Jenkins, Chef, Shell, Unix, Nginx, Tomcat, Ansible, JDK, Puppet, Bugzilla, Perl.
DevOps Engineer
Information Technology
Jan 2017 - Jun 2018
Responsibilities:
  • Developed security policies and processes; developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface for migration to AWS.
  • Developed scripts using RESTful API models to integrate workflows with AWS.
  • Created functions and assigned roles in AWS Lambda to run Python scripts, used AWS Lambda to perform event-driven processing, and created Lambda jobs and configured roles using the AWS CLI.
  • Created the entire application using Python, Django, MySQL, and Linux, and developed a fully automated continuous integration system using Git, Jenkins, MySQL, and custom tools developed in Python and Bash.
  • Wrote Ansible playbooks, with Python SSH as the wrapper, to manage configurations of OpenStack nodes, and tested the playbooks on AWS instances using Python.
  • Used Bash and Python, including Boto3, to supplement automation provided by Ansible and Terraform for tasks such as encrypting EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks; developed entire frontend and backend modules using Python on the Django web framework.
  • Used Python and Django for creating graphics, XML processing, data exchange, and business logic implementation.
  • Utilized PyUnit, Pytest and the Python unit test framework for all Python applications. Developed Python framework using various Python libraries like Pandas and NumPy for Data validation.
  • Designed and maintained databases using Python and developed Python-based API (RESTful Web Service) using Flask, SQL Alchemy, and PostgreSQL.
  • Developed the customer complaints application using the Django framework, including the Python code.
  • Used the Amazon EC2 command-line interface along with Bash/Python to automate repetitive work.
  • Used Python-based GUI components for frontend functionality such as selection criteria.
  • Used the Pandas API to put the data into time-series and tabular format for easy timestamp data manipulation and retrieval.
  • Built REST-based microservices with RESTful APIs, and designed and built the UI for customer sites using HTML and CSS.
  • Developed Restful Microservices using Flask and Django and deployed on AWS servers using EBS and EC2.
  • Involved in building database models, APIs, and views using Python technologies to build web-based applications.
Skills:
  • Python, Django, RESTful API, AWS Lambda, AWS CLI, MySQL, Ansible Playbooks, Bash, EBS, AWS AMI, Flask, SQLAlchemy, PostgreSQL, Pandas, NumPy, GUI, Git, Jenkins.
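The data-validation work described above can be illustrated with a small sketch. The field names and rules below are purely hypothetical; the real framework applied similar checks over whole datasets using Pandas and NumPy.

```python
from datetime import datetime


def _is_iso_date(value):
    """True when `value` is a YYYY-MM-DD date string."""
    try:
        datetime.strptime(value, "%Y-%m-%d")
        return True
    except (TypeError, ValueError):
        return False


# Illustrative per-field rules; a real ruleset would mirror the schema.
RULES = {
    "user_id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "signup_date": _is_iso_date,
}


def validate_record(record):
    """Return the names of fields that are missing or fail their rule."""
    return [field for field, rule in RULES.items()
            if field not in record or not rule(record[field])]
```

An empty return value means the record passed every rule and is safe to load.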
System Operation Engineer
Information Technology
Jun 2014 - Dec 2015
Responsibilities:
  • Translated the customer requirements into design specifications and ensured that the requirements translated into a software solution.
  • Developed and designed an API (RESTful web service); used Python to develop web-based data retrieval systems.
  • Followed Agile (Scrum) practices: sprint planning, daily Scrum meetings, and sprint retrospectives to produce quality deliverables on time.
  • Worked on RESTful web services that enforced a stateless client-server model and supported JSON, moving a few services from SOAP to REST; involved in detailed analysis based on the requirement documents.
  • Designed and maintained databases using Python and developed Python based API (RESTful Web Service) using Flask, SQL-Alchemy and PostgreSQL.
  • Created complex dynamic HTML UIs using jQuery; automated regression analysis for determining fund returns based on index returns (Python/Excel); developed SQL stored procedures, triggers, and functions on MySQL.
  • Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
  • Worked with JIRA for issue tracking, reporting, versions, epics, sprints, etc.; used a test-driven approach and implemented unit tests using the Python unittest framework.
  • Used Python-based GUI components for frontend functionality such as selection criteria; connected the continuous integration system to the Git version control repository to build continually as check-ins came in from developers.
  • Wrote SQL queries implementing functions, triggers, cursors, object types, sequences, indexes, etc., and developed and tested many dashboard features using Python, Bootstrap, and CSS.
  • Performed advanced procedures like text analytics and processing using the in-memory computing capabilities of Spark.
  • Delivered datasets from Snowflake to the One Lake data warehouse, built a CI/CD pipeline using Jenkins and AWS Lambda, and imported data from DynamoDB to Redshift in batches using AWS Batch with the TWS scheduler.
  • Created and managed all hosted and local repositories through SourceTree's simple Git client interface, and worked with Git command lines and Stash.
  • Developed and reviewed SQL queries using join clauses (inner, left, right) in Tableau Desktop to validate static and dynamic data.
  • Designed and developed components using Python with Django framework. Implemented code in Python to retrieve and manipulate data.
  • Developed an enterprise social network application using Python, Twisted, and Cassandra, and was responsible for setting up the Python REST API framework using Django.
  • Actively participated in Object-Oriented Analysis Design sessions of the project based on MVC Architecture using Spring Framework.
  • Maintained and developed complex SQL queries, stored procedures, views, functions, and reports that met customer requirements using Microsoft SQL Server 2008 R2.
Skills:
  • Python, Django, Flask, REST API, DynamoDB, AWS Lambda, Redshift, Git, SQL, MySQL, CSS, PostgreSQL, JSON, Terraform, CloudFormation.
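The test-driven approach mentioned above can be illustrated with the Python unittest framework. `fund_return` is a hypothetical stand-in for the fund-return calculations in the regression-analysis bullet, not the actual code:

```python
import unittest


def fund_return(start_nav, end_nav, distributions=0.0):
    """Simple holding-period return; an illustrative example, not production code."""
    if start_nav <= 0:
        raise ValueError("start_nav must be positive")
    return (end_nav + distributions - start_nav) / start_nav


class FundReturnTest(unittest.TestCase):
    def test_gain(self):
        self.assertAlmostEqual(fund_return(100.0, 110.0), 0.10)

    def test_distribution_counts_toward_return(self):
        self.assertAlmostEqual(fund_return(100.0, 100.0, distributions=5.0), 0.05)

    def test_rejects_nonpositive_start(self):
        with self.assertRaises(ValueError):
            fund_return(0.0, 10.0)
```

Run with `python -m unittest`; in a test-driven workflow the tests are written first and drive the shape of the implementation.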
Python Developer
Information Technology
Feb 2013 - May 2014
Responsibilities:
  • Understood the client requirements and developed code using Python scripting.
  • Developed code using Python modules.
  • Developed the login and registration modules.
  • Involved in the design and implementation of the project.
  • Created the page templates required for the project.
  • Interacted with team members on technical programming and resolved production support issues (bug fixes, queries, etc.).
  • Prepared documentation for the generated reports; developed the customer complaints application using the Django framework.
  • Profiled Python code for optimization and memory management.
  • Implemented locking mechanisms using multithreading functionality.
  • Involved in the development of main modules such as CSV import and bulk content upload.
  • Wrote unit test cases wherever required on the application.
  • Performed code optimization.
Skills:
  • Python, Django, HTML5, CSS3, Windows OS, Linux.
Technical Support Associate
Information Technology
Mar 2011 - Mar 2013
Responsibilities:
  • Handle 20-30 calls per day from MicroStrategy customers, partners, and internal personnel.
  • Manage various Technical Support email inboxes, processing all incoming email communications.
  • Create and maintain online accounts for customers and partners to access secure MicroStrategy websites.
  • Process requests from customers and partners for new software license keys when new product versions are made generally available.
  • Issue software license keys and online accounts for new software transactions.
  • Coordinate schedules for 24x7 technical support rotations.
  • Work an adjusted shift schedule, possibly until midnight or later, on our last business day of each quarter to support the completion of processing all new software transactions.
  • Assist users via email, forums, and social media channels on a variety of topics, from account management to technical support.
  • Assist in the moderation of all user-generated content.
Remote/Chat Support
Information Technology
Mar 2011 - Mar 2011
OS, Windows, Excel, Word, Desktop, Networking.
Education
Certifications
Udemy: Python Developer
LinkedIn: DevOps Foundations
Udemy: RPA - Automation Anywhere
LinkedIn: Linux Command Line
LinkedIn: Networking Foundations: Protocols and CLI Tools
LinkedIn: Python Data Structures
AWS: Solutions Architect Associate
AWS: Certified Developer - Associate