Teradata Developer Resume

DANIEL PAUL

Email: paul123@gmail.com
Phone: (555)-555-5555


CAREER OBJECTIVES

Senior BI Analyst currently associated with Delta Technologies, with 10+ years of experience in software development and business analysis in data warehousing and BI.

Key skills and areas of experience:

  • Teradata Development
  • Data warehousing (RLDM, FSLDM, CLDM)
  • Experience using Teradata SQL Assistant, Teradata Administrator, and PMON, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport, with exposure to TPump, in UNIX/Windows environments; ran the batch process for Teradata.
  • Strong data warehousing experience in application development and quality assurance testing using Informatica PowerCenter 9.1/8.6 (Designer, Workflow Manager, Workflow Monitor), PowerExchange, OLAP, and OLTP.
  • Experience creating complex Informatica mappings using Source Qualifier, Expression, Router, Aggregator, Lookup, Normalizer, and other transformations; well versed in debugging Informatica mappings with the Debugger.
  • Efficient development skills in Teradata SQL objects: creating tables, stored procedures, functions, views, and indexes, plus performance tuning.
  • Experience designing and deploying reports for end-user requests using Cognos, Teradata, and Excel.
  • Able to work effectively with remote locations, including onsite and offshore stakeholder teams.
  • Create quality-compliant code and provide guidance to developers.
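The Teradata SQL object work listed above centers on choices such as Primary Index selection; a minimal illustrative sketch (all database, table, and column names here are hypothetical):

```sql
-- Choosing a UNIQUE PRIMARY INDEX so rows hash evenly across AMPs
CREATE TABLE edw_db.customer_dim (
    cust_id    INTEGER NOT NULL,
    cust_name  VARCHAR(100),
    region_cd  CHAR(2)
)
UNIQUE PRIMARY INDEX (cust_id);

-- A view exposing the table to reporting users
REPLACE VIEW edw_db.v_customer_dim AS
SELECT cust_id, cust_name, region_cd
FROM edw_db.customer_dim;
```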
EDUCATION

  Course (Stream)/Examination   Institution/University/School                           Year of Passing   Performance
  BTech                         Tagore College of Engineering and Technology, Chennai   2007              85%
  HSC                           Scread Higher Secondary School                          2002              74%
  SSLC                          St. Mary Higher Secondary School                        2000              70%

SKILLS
  • TERADATA (8 years)
  • Business Analysis
WORK EXPERIENCE
1.) Teradata Developer

Company Name-Location – May 2015 to July 2016

Responsibilities:

  • Involved in requirements and data gathering to support developers with the design specification.
  • Extracted data from various source systems like Oracle, SQL Server and flat files as per the requirements.
  • Wrote and executed BTEQ scripts to validate and test sessions, verify data integrity between source and target databases, and generate reports.
  • Loaded data into Teradata from legacy systems and flat files using complex MLOAD and FASTLOAD scripts.
  • Created Teradata external loader connections (MLoad Upsert, MLoad Update, and FastLoad) while loading data into target tables in the Teradata database.
  • Created appropriate Primary Indexes (PIs), taking into consideration both planned access of data and even distribution of data across all available AMPs.
  • Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Created mappings and scheduled workflows using Informatica.
  • Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad), and worked with loader logs.
  • The mappings involved extensive use of transformations such as Aggregator, Filter, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator.
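The BTEQ validation work above can be sketched as follows; the logon string, databases, and tables are illustrative, not from the original projects:

```sql
.LOGON tdprod/etl_user,password

/* Row-count reconciliation between stage and target */
SELECT 'STG' AS side, COUNT(*) AS row_cnt FROM stg_db.customer_stg
UNION ALL
SELECT 'TGT', COUNT(*) FROM edw_db.customer_dim;

/* Fail the batch step with a non-zero return code if the query errored */
.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0
```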
2.) Teradata Developer

Company Name-Location – November 2012 to April 2015

Responsibilities:

  • Teradata application development: timely development, testing, and deployment of deliverables based on the requirements.
  • Worked in conjunction with the other team members to gather detailed reporting requirements.
  • Implemented pull/push techniques to access various file formats (.csv, .xls, .dat, .txt) from the source system.
  • Developed a good understanding of the operations/processes of all relevant departments and provided analytical support as needed.
  • Understood the data/system needs of the business and developed integrated systems for creating reports to be presented to management.
  • Developed Teradata MLOAD scripts for stage loads.
  • Teradata development based on business requirements, delivering solutions using multiple Teradata utilities such as BTEQ, FastLoad, FastExport, and MultiLoad.
  • Development as part of change requests and application enhancements.
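A MultiLoad stage load of the kind described above can be sketched like this (the log table, target table, layout, and file path are all hypothetical):

```sql
.LOGTABLE stg_db.customer_ml_log;
.LOGON tdprod/etl_user,password;

.BEGIN IMPORT MLOAD TABLES stg_db.customer_stg
    WORKTABLES stg_db.customer_wt
    ERRORTABLES stg_db.customer_et stg_db.customer_uv;

.LAYOUT cust_layout;
.FIELD cust_id   * VARCHAR(10);
.FIELD cust_name * VARCHAR(100);

.DML LABEL ins_cust;
INSERT INTO stg_db.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);

.IMPORT INFILE /data/in/customer.dat
    FORMAT VARTEXT '|'
    LAYOUT cust_layout
    APPLY ins_cust;

.END MLOAD;
.LOGOFF;
```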
3.) Programmer

Company Name-Location – April 2008 to November 2011

Responsibilities:

  • Developed a good understanding of the operations/processes of all relevant departments and provided analytical support as needed.
  • Teradata performance tuning: avoided unnecessary joins on production tables during peak periods.
  • Created quality-compliant code and provided guidance to developers.
  • Conducted detailed technical reviews of code artifacts and ensured alignment of the code with requirements and quality standards.
  • Raised relevant risks and issues to the project manager for challenges with either technical solutions or delivery timelines.
  • Teradata application management, including weekend support for batch runs.
  • Development as part of change requests and application enhancements.
  • Teradata performance tuning.
  • Automated current manual processes.
  • Teradata pre-production testing.
  • Production support for the complete application, including Teradata VR12 and Merchandise Planner.
  • Knowledge sharing carried out for both onshore and offshore teams.
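The performance tuning listed above usually starts with the optimizer's plan and fresh statistics; a minimal sketch (object names hypothetical):

```sql
-- Inspect the join plan before the query runs against production tables
EXPLAIN
SELECT o.order_id, c.cust_name
FROM edw_db.orders o
JOIN edw_db.customer_dim c
  ON o.cust_id = c.cust_id;

-- Keep statistics current so the optimizer estimates rows accurately
COLLECT STATISTICS ON edw_db.orders COLUMN (cust_id);
```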
ADDITIONAL INFORMATION

Technical:

  • Operating Systems: Windows, UNIX
  • RDBMS: Teradata; Languages: PL/SQL
  • Tools and Utilities: BTEQ, MultiLoad, FastLoad, FastExport, SQL Assistant
  • Scripting Language: UNIX Shell Scripting
  • Visualization Tools: Qlik Sense, Power BI

Functional:

  • As a BI Analyst, involved in every phase of software development: liaising with the client to gather requirements, preparing the Functional Requirement Specification (FRS), walking the client through the FRS to agree on the requirements, providing solutions, preparing design documents, implementing the logic, integrating the code, and providing unit test cases and validating them.
  • Organized meetings with stakeholders to gather business requirements, create task backlog, scope/size tasks, prioritize tasks and allocate BI resources.