Ram Prakash Bakthinarayanan

Bentonville, AR

Summary

Big Data Developer with 13+ years of experience in Information Technology across all phases of the Software Development Life Cycle (SDLC), with a strong object-oriented programming background and a focus on both emerging and established technologies.

I am currently working as a Staff Data Engineer at Walmart, located in Bentonville, Arkansas, USA.

Overview

13 years of professional experience
1 certification

Work History

Staff Data Engineer

Walmart
03.2020 - Current

Start date: 03/23/2020 - Present

Technologies Used: Apache Spark/Apache Druid/Google Cloud/Scala/Hive/Thoughtspot/Sqoop/Unix shell programming/GPU

  • Design and develop scalable, complex data pipelines using Apache Spark and Hive to load the pre-consumption table
  • Migrated ETL flows from Spark 2.x to Spark 3.x
  • Utilize GPU accelerators to improve ETL performance
  • Develop Sqoop scripts to load data from various RDBMS sources into Hadoop
  • Develop and support data sources in Druid
  • Develop ThoughtSpot worksheets, pins, and pinboards
  • Developed data pipelines using Dataproc as a service
  • Design and develop Automic workflows to execute the ETL flows and determine the optimal time for workflow execution
  • Involved in migrating workflows from Automic to Airflow
  • Designed and developed ETL flows to support MerchOne, a web application used by merchants at Walmart
  • Responsible for reviewing code deliverables within the scrum team
  • Work closely with product teams and other stakeholders to convert business requirements into technical implementation documents

Data Engineer

System Soft Technologies LLC
08.2019 - 03.2020

Start date: 08/19/2019 End date: 03/2020

Technologies Used: Hive/Thoughtspot/Sqoop/Unix shell programming

Client: Walmart Labs

  • Design and develop scalable, complex Hive queries to load the pre-consumption table
  • Develop Sqoop scripts to load data from Teradata into HDFS
  • Develop ThoughtSpot worksheets, pins, and pinboards
  • Fix defects/bugs in non-production and production environments, tracked through Jira tickets
  • Design and develop Automic workflows to execute the ETL flows and determine the optimal time for workflow execution
  • Track and maintain performance metrics and resource usage for key application processes using Resource Manager and Ambari

Senior Data Engineer

Cognizant Technology Solutions Corp
02.2019 - 07.2019

Start date: 02/25/2019 End date: 08/16/2019

Client: Walmart Labs

Technology Stack: Apache Hive/Apache Spark/YAML/Shell scripting/Jira/GitHub/Google Cloud/Microsoft Azure

Application: Data Lake project

The primary objective is to load data from various sources such as Oracle, Teradata, DB2, and Informix into the data lake, Azure, and Google Cloud Platform.

  • Design and develop near-real-time data streaming using Kafka and PySpark programs after loading sufficient history data
  • Develop Hive programs to transform the data and load it into the data lake
  • Perform ETL by ingesting data from Kafka into Hive
  • Develop YAML configurations to push data from the data lake to Azure and Google Cloud Platform
  • Develop Sqoop jobs to pull data from various sources
  • Develop workflows and jobs in the Automic scheduler to execute the YAML configurations
  • Prioritize work based on Jira tickets
  • Use GitHub for version control

Bigdata ETL Developer

Tata Consultancy Services
03.2016 - 01.2019

Start date: 03/01/2016 End date: 02/22/2019

Client: Kaiser Permanente

Technology Stack: Apache Hive/PySpark/Spark SQL/Apache Oozie/Oracle PL/SQL/Perl/Shell programming/AWS

Application: MBE (Medicare Business Engine)

The primary objective of MBE is to handle Medicare members throughout Kaiser. Membership, benefits, and CMS compliance letters are the major areas in MBE. The data from MBE applications is ingested into the data lake.

  • Performed analysis on existing systems, recommended enhancements, and implemented strategies to streamline current processes
  • Developed Sqoop and Hive scripts to load data from Oracle into the data lake (Hadoop)
  • Worked with Oracle GoldenGate and Kafka for streaming data
  • Developed Spark programs to assess data quality after loading into staging tables
  • Developed scripts to ingest data from the data lake into AWS cloud buckets
  • Developed dashboards in Tableau based on business requirements
  • Developed user-defined functions in Python for complex transformations
  • Developed shell scripts to invoke the Hive scripts
  • Coordinated with the scheduling team to schedule Unix jobs
  • Represented the development team at client meetings
  • Worked closely with clients to establish problem specifications and system designs
  • Developed complex database views in Oracle PL/SQL, which were then used by Sqoop to extract data
  • Performed performance tuning on existing views in Oracle SQL
  • Worked in an onsite-offshore model and led the team for delivery

Database and ETL Developer

Tata Consultancy Services
09.2015 - 02.2016

Client: Kaiser Permanente

Technology Stack: Oracle PL/SQL/Unix/Informatica

Application: Claims Xcelys/Diamond

The primary objective of Xcelys and Diamond is to handle claims processing, claim adjudication, and member benefit accumulation.

  • Developed and maintained Oracle DB jobs running on Linux, scheduled through Tivoli Workload Scheduler
  • Developed and maintained ETL workflows for generating reports using Informatica 9.1

Database And J2EE Developer

Tata Consultancy Services
01.2012 - 08.2015

Client: Kaiser Permanente

Technology Stack: Oracle PL/SQL/Unix/Java

Application: Claims Xcelys/Diamond

The primary objective of Xcelys and Diamond is to handle claims processing, claim adjudication, and member benefit accumulation.

  • Developed code fixes and enhancements in Oracle PL/SQL for inclusion in future code releases and patches; involved in the design, development, maintenance, and support of the Diamond application
  • Involved in the architecture and design of the application using J2EE design patterns such as Observer, Builder, and Singleton
  • Used IBM MQ for receiving messages from various interfaces
  • Used IBM TWS for scheduling batch jobs

Junior J2EE Developer

Tata Consultancy Services
07.2011 - 01.2012
  • Trained in basic and advanced concepts of Core Java and Java EE, along with frameworks such as Hibernate, Spring, and SOA
  • Extensively worked on the front-end, business, and persistence tiers using the Struts framework
  • Involved in the design, development, and testing phases of the software development life cycle
  • Developed the presentation layer using JavaScript and Struts tag libraries (logic, html, bean, etc.) in JSP pages
  • Used GitHub as Version Control System

Education

Bachelor of Engineering - Computer Science

Thiagarajar College Of Engineering
Madurai, Tamil Nadu, India
2011

Skills

  • Big data and Hadoop framework
  • Apache Spark
  • Google cloud platform
  • GPU processing
  • Shell scripting
  • Apache Hive
  • Apache Druid
  • Thoughtspot
  • AWS
  • Apache Oozie
  • Apache Pig
  • Python
  • Oracle PL/SQL
  • Informatica 9.1
  • Core Java
  • MongoDB
  • Apache Pinot
  • Snowflake

Professional Summary

  • Over 13 years of programming experience in Big Data technologies: Hive, Spark SQL, Sqoop, Oracle Database, shell scripting, Informatica, Java/JEE development, and Google Cloud Platform
  • Experience in building data pipelines in Google Cloud
  • Experience in utilizing GPU accelerators to improve ETL pipelines
  • Experience in developing Pig and Hive scripts
  • Experience in PySpark and Spark SQL
  • Experienced in Apache Druid
  • Experience in Apache Kafka
  • Strong knowledge of Unix shell programming
  • Experience in developing complex DB objects
  • Experience in performance tuning and scalability
  • Experienced in Oracle GoldenGate
  • Participated in the Agile SDLC (standups, estimation, iterative development, continuous integration, demos, retrospectives)
  • Experience leading a small, high-functioning agile team with a heavy emphasis on crisp delivery of the user stories created by the product owner and stakeholders
  • Experience with version control systems such as GitHub, StarTeam, and RTC
  • Experience in using BMC Remedy and Jira
  • Experience in developing workflows and jobs in the Automic scheduler and Tivoli Workload Scheduler
  • Intermediate knowledge of front-end technologies


Certification

Oracle PL/SQL Developer Certified Associate

America's Health Insurance Plans Certified Professional, Academy for Healthcare Management (PAHM)

Coursera certified Hadoop Platform and Application Framework

Edureka certified Big Data and Hadoop Developer

TCS certified Big Data and Hadoop Ecosystems Foundation

Edureka certified Apache Spark and Scala Developer

MongoDB University - completed the MongoDB Basics (M001) course

Tableau Desktop Specialist certified

Awards

  • Awarded best team player for 2012 for individual contributions to the Kaiser Permanente Claims Diamond project, making key contributions to its success
  • Awarded best team player for 2013 for automating a manual process, thereby saving a million dollars per year
  • Best team award for 2014, presented by Ajoyendra Mukherjee, EVP & Head, Global HR
  • Special Initiative award for 2014, presented by Ajoyendra Mukherjee, EVP & Head, Global HR
  • Special Initiative award for 2015, presented by Ajoyendra Mukherjee, EVP & Head, Global HR
  • Best technical lead award for 2017, presented by Ajoyendra Mukherjee, EVP & Head, Global HR, for mentoring junior colleagues across the organization at TCS
