Address | Email-Id | Telephone
- Around 3 years of data warehousing experience using Informatica Power Center 9.x/10.x. Good knowledge of data warehousing concepts such as Star Schema, Snowflake Schema, and Dimension and Fact tables.
- Currently working in the Business Intelligence Competency for a Cisco client as an ETL Developer.
- Extensively used Informatica client tools – Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
- Worked with Teradata and Oracle databases, with Unix as the backend environment.
- Used various transformations such as Filter, Sorter, Aggregator, Lookup, Expression, Router, Update Strategy, Joiner, and Normalizer in mappings to implement simple and complex business logic.
- Expertise in slowly changing dimensions, maintaining historical as well as incremental data using Type I, Type II, and Type III strategies.
- Basic knowledge of UNIX. Familiar with the Tidal Enterprise scheduling tool and with handling non-simultaneous dependencies between different functional areas.
- Handled ad-hoc requests such as processing orders in both Test and Production environments.
- Interacted with clients on a regular basis to understand business requirements and deliver better solutions.
- Good technical and communication skills.
- Motivated and quick to learn, with good analytical and problem-solving skills.
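The Type II slowly changing dimension strategy mentioned above can be sketched as follows. This is a minimal illustrative example using Python's built-in sqlite3 module; the table and column names (`dim_customer`, `eff_start`, `eff_end`, `is_current`) are hypothetical and not taken from any Cisco or GE project.

```python
import sqlite3

# Minimal SCD Type II sketch: keep full history by closing the current
# row and inserting a new version when a tracked attribute changes.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        eff_start   TEXT,
        eff_end     TEXT,      -- NULL marks the open-ended current row
        is_current  INTEGER
    )
""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Austin', '2016-01-01', NULL, 1)")

def scd2_update(cur, customer_id, new_city, load_date):
    """Close the current row and insert a new version (Type II history)."""
    cur.execute(
        "UPDATE dim_customer SET eff_end = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1 AND city <> ?",
        (load_date, customer_id, new_city),
    )
    if cur.rowcount:  # insert a new version only when something changed
        cur.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, load_date),
        )

scd2_update(cur, 1, "Dallas", "2017-06-01")
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_start"
).fetchall()
print(rows)  # [('Austin', 0), ('Dallas', 1)]
```

A Type I load would instead overwrite `city` in place (no history), and Type III would keep a single `previous_city` column alongside the current value.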
- B.Tech in Electronics & Communications Engineering, College-Location, 2009 to 2011
- Diploma in Information Technology, College-Location, 2009
- S.S.L.C. in Science stream, School-Location, 2007
- Informatica, Unix, Teradata
1) ETL Developer
Company Name-Location – May 2017 to Present
Database: Teradata, Oracle
Tools: Informatica Power Center 10.1, Urelease/Udeploy, Tidal Enterprise solution, Citrix
Cisco is one of the global market leaders in the manufacture of network equipment and maintains its own data warehouse. Customer transactions are stored in an Oracle database as OLTP data; using Informatica, this data is migrated from Oracle to Teradata with Teradata SQL queries. My major development stream is Bookings, which is at the heart of Cisco's business. Bookings data comes in several streams, such as ERP, POS, ADJUSTMENTS, and XAAS; my stream focuses on POS.
Key Contribution Areas:
- Understanding the business and scope requirements from the customer completely.
- Carrying out development and deployment activities for both History and Incremental code in Dev instances.
- Summarizing the release scope based on the functionality of the project.
- Writing unit test cases for relevant scenarios, such as count checks, attribute checks, performance, and functional and business-logic validations.
- Carrying out sanity checks on code migrated during deployment, then loading the History and Incremental data.
- Validating the History data for all newly populated 3NF tables as per the flow.
- Comparing and validating the old code and flow against the new code.
- Providing IT validation sign-off on both History and Incremental data to the QA team.
2) Informatica ETL Developer
Company Name-Location – July 2015 to April 2017
Database: Oracle (via SQL Developer)
Tools: Informatica Power Center 9.6.1, 10.1, Winscp, Putty
CDW is one of the important projects, providing a complete 360-degree view of GE customers. This application integrates downstream data for a few GE businesses to track the business of GE's top customers.
Key Contribution Areas:
- Analyze and understand the Technical Specification Document (Source to Target matrix) for all mappings and clarify the issues with Data Architects.
- Extensively used Informatica client tools – Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
- Developed different mappings using transformations such as Aggregator, Lookup, Expression, Update Strategy, Joiner, and Router to load data into staging tables and then into the target.
- Monitored transformation processes using Informatica Workflow Monitor.
- Provided enhancements to existing ETLs as per requirements.
- Migrated code from one environment to another.
Areas of Expertise:
- Domain Knowledge: Data Warehousing
- Tools: Informatica Power Center 10.1, Artifactory, Rally, Tidal Enterprise solution, URelease/UDeploy, SSH Tectia
- Operating Systems: Windows, Unix
- Databases: Teradata 15, Oracle 12, SQL Developer 10g
- Languages: SQL, PL/SQL