
Manimegalai Sadasivam

Bentonville

Summary

• 14 years 4 months of total experience in analysis, design, development, and enhancement

• Skilled in developing and enhancing applications built on Unix shell/Perl scripting, DataStage, Hadoop, and Google BigQuery

• Expertise in big data technologies: Hive, Sqoop, MapReduce, Oozie, Spark, Scala, Pig, Kafka, Flume, Spark Streaming, Python, PySpark, ThoughtSpot, Automic scheduler, data modelling (Erwin), Google Cloud BigQuery, Dataflow streaming pipelines, and Azure Databricks

• Experienced in turning opportunities into customer value; adept at applying strategic methods to achieve IT system stability, availability, and performance, and to automate repetitive processes

• Worked with DataStage (ETL) versions 8.5, 8.7, 9.1, and 11.5

• Experience in writing shell scripts for importing data using Sqoop and TDCH

• Experience with the Hive database

• Experience in writing Scala jobs for Spark applications

• Exported data from Hive to SQL Server using Sqoop export

• Experience in performance tuning and optimization of DataStage parallel jobs

• Expertise in Oracle 11g, Hadoop, Hive, Teradata, and Greenplum

• 8+ years of experience in Agile methodology.
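
As a hedged illustration of the Sqoop-based imports mentioned above (the JDBC URL, table name, and HDFS path are invented placeholders, not details from this resume), a wrapper script might assemble the import command like this:

```shell
#!/bin/sh
# Minimal sketch of a Sqoop import wrapper; all connection details,
# table names, and paths are hypothetical placeholders.
build_sqoop_import() {
  db_url="$1"; table="$2"; target_dir="$3"
  printf 'sqoop import --connect %s --table %s --target-dir %s --num-mappers 4\n' \
    "$db_url" "$table" "$target_dir"
}

# Print the command rather than running it, so the sketch does not
# depend on a live Hadoop cluster.
build_sqoop_import "jdbc:teradata://tdhost/DATABASE=sales" "daily_sales" "/data/raw/daily_sales"
```

In a real job the built command would be executed (and its exit code checked) instead of printed.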

Overview

14 years of professional experience

Work History

Developer

Wal-Mart Stores, Inc
11.2023 - Current
  • Involved in data ingestion, processing, and extraction
  • Loaded and transformed data using Hive queries
  • Developed Hadoop jobs for processing millions of records of text data
  • Cleansed data and filtered bad records from source data using Unix scripting, per customer needs
  • Organized and scheduled Hive jobs in workflows using the Automic scheduler
  • Worked with the generic ETL framework Aorta
  • Developed programs/models using Hive, Sqoop, Oozie, and Java (depending on the use case) to bring data from Kafka sources into Hadoop
  • Maintained the code base in a version control system (GitHub); performed unit tests and handed over to the quality analyst
  • Prepared data models for tables in strict compliance with modelling standards
  • Translated Teradata queries to Hive queries; analysed large data sets by running Hive queries and tuning performance
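
One common pattern behind Teradata-to-Hive translation work is rewriting Teradata's QUALIFY clause, which Hive does not support, as a subquery filter. A minimal sketch, with invented table and column names:

```shell
#!/bin/sh
# Hypothetical example of translating a Teradata query to Hive.
# Teradata (QUALIFY filters directly on a window function):
#   SELECT store_id, sales_amt FROM daily_sales
#   QUALIFY ROW_NUMBER() OVER (PARTITION BY store_id ORDER BY sales_amt DESC) = 1
# Hive has no QUALIFY, so the window function moves into a subquery:
hive_sql="SELECT store_id, sales_amt FROM (
  SELECT store_id, sales_amt,
         row_number() OVER (PARTITION BY store_id ORDER BY sales_amt DESC) AS rn
  FROM daily_sales
) t WHERE rn = 1"

# A real job would execute it as: hive -e "$hive_sql"
printf '%s\n' "$hive_sql"
```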

Developer

Wal-Mart Stores, Inc
02.2023 - 11.2023
  • Involved in data ingestion, processing, and extraction
  • Imported data from different sources such as Teradata and DB2
  • Used TDCH and Sqoop with the internal framework AORTA
  • Loaded and transformed data using Hive queries
  • Developed Hadoop jobs for processing millions of records of text data
  • Cleansed data and filtered bad records from source data using Unix scripting, per customer needs
  • Organized and scheduled Hive jobs in workflows using the Automic scheduler
  • Worked with the generic ETL framework Aorta
  • Developed programs/models using Hive, Sqoop, Oozie, and Java (depending on the use case) to bring data from Kafka sources into Hadoop
  • Maintained the code base in a version control system (GitHub); performed unit tests and handed over to the quality analyst
  • Prepared data models for tables in strict compliance with modelling standards
  • Translated Teradata queries to Hive queries; analysed large data sets by running Hive queries and tuning performance

Developer

Wal-Mart Stores, Inc
05.2022 - 01.2023
  • Designed and developed Airflow workflows, Unix scripts, Hive tables, and PySpark scripts
  • Developed complex transformations using Apache Spark and stored the results in Hive tables
  • Debugged applications using Spark and Airflow logs to analyse issues
  • Provided timely and appropriate communication to business owners, stakeholders, and users on issue status and resolution
  • Designed a high-quality integrated system capable of meeting the requirements of the rationalized interfaces
  • Applied an effective and efficient technical approach consistent with the defined architecture
  • Worked on the SDV tool for data validation
  • Identified the required tables in Prod 17 and the required transformations, and mapped them to the target system for ETL development
  • Created shell scripts for comparing data between Prod 17 and GCP
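
A data-comparison script of that kind often reduces to reconciling row counts between the source and target. The sketch below keeps the comparison logic pure (counts passed as arguments) so it can be tested without a cluster; in a real script the counts might come from `hive -e "SELECT COUNT(*) ..."` on-prem and `bq query` on GCP, and the table names here are placeholders:

```shell
#!/bin/sh
# Sketch of a source-vs-target row-count reconciliation check.
# src_count and tgt_count would come from the two systems being compared.
compare_counts() {
  src_count="$1"; tgt_count="$2"; table="$3"
  if [ "$src_count" -eq "$tgt_count" ]; then
    echo "MATCH $table: $src_count rows"
  else
    echo "MISMATCH $table: source=$src_count target=$tgt_count"
  fi
}

compare_counts 1500 1500 daily_sales   # prints a MATCH line
compare_counts 1500 1498 store_dim     # prints a MISMATCH line
```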

Data Warehouse Specialist

Wal-Mart Stores, Inc
02.2020 - 05.2022
  • Gathered requirements from project stakeholders and analysed them
  • Identified and analysed the source systems (Teradata/SQL Server/DB2/Oracle) that provide data for the requirements
  • Identified and documented each required data attribute and transformation, and mapped them to the target system for ETL development
  • Prepared HLD and LLD documents reflecting the functional and technical specifications
  • Worked closely with data scientists to help create the models/algorithms that forecast business needs
  • Developed Spark Scala scripts to handle the transformations identified in the analysis and mapping documents
  • Developed data pipelines to load consumption-layer tables for business users, using big data technologies and the Hadoop ecosystem (Hive, Spark)
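
Spark Scala jobs like those described above are typically launched on the cluster via `spark-submit`. A minimal wrapper sketch; the class name, jar path, and date argument are hypothetical placeholders:

```shell
#!/bin/sh
# Sketch of a launcher for a Spark Scala transformation job.
# Main class, jar, and run date below are invented for illustration.
build_spark_submit() {
  main_class="$1"; jar="$2"; run_date="$3"
  printf 'spark-submit --master yarn --deploy-mode cluster --class %s %s %s\n' \
    "$main_class" "$jar" "$run_date"
}

# Print the command instead of running it, so the sketch is self-contained.
build_spark_submit "com.example.SalesTransform" "/apps/etl/sales-transform.jar" "2022-01-31"
```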

System Analyst

Wal-Mart Stores, Inc
12.2017 - 01.2020
  • Gathered requirements from project stakeholders and analysed them
  • Captured the design for these requirements in HLD (High Level Design) and LLD (Low Level Design) documents, also known as functional and technical specification templates
  • Prepared source-to-foundation mapping documents containing the business rules for mapping each column from source to target for the required ETL jobs
  • Developed Hive scripts to handle the transformations identified in the analysis
  • Developed Unix shell scripts to integrate the ETL components
  • Created separate UI environments for Development, Testing, UAT, and Production; front-end code was developed based on the design documents

Web Developer

Wal-Mart Stores, Inc
11.2015 - 11.2017
  • Designed and developed the Walmart Data Cafe web application
  • The application provides an enterprise solution for Walmart and Sam's Club business users: a high-performance web portal visually representing statistical data on their sales, profit, product movements, and more

Developer

Target Corporation
10.2014 - 04.2015
  • Created referential inbounds to Soft Solutions for various interfaces, loading data into Soft Solutions staging tables and then into core tables
  • Based on the loaded data, Soft Solutions calculates the best possible promotions and pricing to offer guests relative to competitors' prices

Developer

Wal-Mart Stores, Inc
09.2013 - 09.2014
  • Calculated baselines for Whirlpool products sold by various retailers
  • The ETL process is implemented with the DataStage tool
  • The data loaded into the target is used to calculate baselines and serves as a source for the Product Master

Developer

Chartis
09.2010 - 08.2013
  • Generated tax expense records for premium records posted by various sources
  • The ETL process is implemented with the DataStage tool
  • The data loaded into the target is used for reporting via the Cognos reporting tool

Developer

AXA UK
01.2010 - 05.2010
  • Migrated the existing application to the latest technology and maintained it effectively
  • Adhered to TCS and client standards, processes, and procedures while maintaining the application
  • Provided value addition by suggesting improvements, building tools, etc.
  • Built technology and process expertise within the team
  • Delivered new initiatives on time

Education

Bachelor of Engineering (Computer Science and Engineering)

Government College of Technology, Anna University
Chennai, Tamil Nadu
05.2009

Skills

  • Hive Query Language
  • Spark
  • Scala
  • Python
  • PySpark
  • ETL - Datastage
  • Teradata
  • UNIX Shell Scripting
  • Hadoop
  • Google BigQuery
  • Automic Scheduler
  • Azure Databricks
  • SQL
  • Greenplum
  • Agile methodology

Application Areas

  • Retail Business User Application
  • Insurance - Underwriting
