Contact No.: 1234567891
E-mail: email@example.com
Present Address: Chennai
- 11 years of core experience in Big Data, automation, and manual testing across e-commerce and finance domain projects.
- 1 year of experience as a Big Data professional with Hadoop ecosystem tools, including HDFS, Sqoop, Spark, Kafka, YARN, Oozie, and ZooKeeper.
- Experience in manipulating and analyzing large datasets and finding patterns and insights within structured and unstructured data.
- Knowledgeable in testing standards, terminology, and methodologies used in application testing, including TDD, ATDD, XP, and BDD.
- Experience in transferring data between HDFS and relational database systems using Sqoop.
- Working experience with the Scrum model of Agile methodology and Agile values and principles.
- Experienced in writing complex Spark programs that work with different file formats such as text, SequenceFile, CSV, and Parquet.
- Extensive experience importing and exporting data using stream-processing platforms such as Kafka.
- Earned the ISEB/ISTQB certification with distinction.
- Plan, design, build, and maintain test automation frameworks.
| Course (Stream)/Examination | Institution | Year of Passing | Percentage |
|---|---|---|---|
| B.E. in Computer Science & Engineering | College Name-Location | 2011 | 75% |
- Big Data/Hadoop Technologies: HDFS, YARN, Sqoop, Spark, Kafka, ZooKeeper, and Oozie
- Languages: Java, Scala, C#
- Testing Tools: Selenium WebDriver, Microsoft VS Coded UI, Visual Studio Load Test, JMeter, Protractor, TestNG, JUnit, Cucumber, Extent Reports
- Frameworks: Page Object Model, Data-Driven, BDD
- Virtualization Suite: Microsoft Virtual PC, VirtualBox, VMware
- Operating Systems: Windows 8/10, Android, iOS, Linux
- Databases: Oracle 8.0/9i, MS SQL Server
- Development Tools: Eclipse, IntelliJ
- Other Tools: TFS, MTM, Visual SourceSafe (VSS), Bugzilla, Git, BrowserStack, Splunk, Maven, Log4j, SQL
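As an illustration of the data-driven framework listed above, a minimal sketch in plain Java (all names here are hypothetical and self-contained, not taken from any actual project): each row of a test-data table carries the inputs plus the expected result, and one loop drives every case.

```java
import java.util.List;

// Minimal data-driven test sketch: the test logic is written once and
// driven by a table of input rows, each carrying its expected outcome.
public class DataDrivenSketch {
    // Hypothetical system under test: validates a username/password pair.
    public static boolean isValidLogin(String user, String pass) {
        return user != null && !user.isEmpty() && pass != null && pass.length() >= 8;
    }

    public static void main(String[] args) {
        // Each row: {username, password, expected validity}
        List<Object[]> rows = List.of(
                new Object[]{"alice", "s3cretpass", true},
                new Object[]{"", "s3cretpass", false},
                new Object[]{"bob", "short", false});
        for (Object[] row : rows) {
            boolean actual = isValidLogin((String) row[0], (String) row[1]);
            if (actual != (Boolean) row[2]) {
                throw new AssertionError("Row failed for user: " + row[0]);
            }
        }
        System.out.println("All rows passed");
    }
}
```

In a real suite the same shape is usually expressed with TestNG's `@DataProvider` or JUnit parameterized tests; the point of the pattern is that adding a test case means adding a data row, not more code.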
1.) Spark Developer
Company Name-Location – October 2012 to Present
Roles & Responsibilities:
- Used Spark Streaming APIs to perform transformations and actions on the fly to build the common learner data model, which reads data from HDFS and persists results back into HDFS.
- Optimized existing Hadoop algorithms using SparkContext, Spark SQL, DataFrames, and pair RDDs.
- Performed advanced procedures such as text analytics and processing using Spark's in-memory computing capabilities.
- Handled large datasets using partitioning, Spark's in-memory capabilities, broadcast variables, and effective and efficient joins and transformations during the ingestion process itself.
- Installed and configured ZooKeeper to coordinate and monitor cluster resources.
- Used Log4j and Splunk for logging and log analysis.
- Deployed Spark applications on Unix systems using PuTTY and the command line.
- Imported and exported data between HDFS and Oracle using Sqoop.
- Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
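The Sqoop transfers described above typically look like the following sketch (a hedged illustration only: the connection string, credentials, table names, and HDFS paths are placeholders, not values from any real project):

```shell
# Import an Oracle table into HDFS (placeholders throughout).
sqoop import \
  --connect jdbc:oracle:thin:@//db-host:1521/ORCL \
  --username etl_user -P \
  --table ORDERS \
  --target-dir /data/raw/orders \
  --num-mappers 4

# Reverse direction: export an HDFS directory back into a relational table.
sqoop export \
  --connect jdbc:oracle:thin:@//db-host:1521/ORCL \
  --username etl_user -P \
  --table ORDERS_SUMMARY \
  --export-dir /data/out/orders_summary
```

`--num-mappers` controls how many parallel map tasks split the transfer; `-P` prompts for the password instead of putting it on the command line.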
2.) Selenium WebDriver
Company Name-Location – 2011
Roles & Responsibilities:
- Understood the requirements and scope of the load tests.
- Set up the local infrastructure with the help of the networking team.
- Configured the test environment for the controller and agents to generate virtual load.
- Created test scripts and executed them on various platforms as required.
- Executed tests before and after fixes and prepared comparison/trend reports.
- Analyzed results and prepared thorough reports with detailed explanations.
- Executed performance and load tests for the website.
- Self-initiated batch data analysis and presented a proof of concept (POC) to the business to surface insights.
- Successfully convinced the business to take up the data analytics project.
- Achieved ISEB/ISTQB Foundation Level certification with a 93% score.
- Received several appreciation emails from managers for preparing detailed QA artifacts with maximum test coverage.
- Successfully automated various applications and generated daily reports.
- Work effectively both within a team and independently.
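The Page Object Model framework listed in the skills section can be sketched as below. This is a self-contained, hypothetical illustration: the `Browser` interface stands in for Selenium's `WebDriver` so the sketch runs without a browser, and all locators and page names are invented.

```java
import java.util.HashMap;
import java.util.Map;

// Page Object Model sketch: each page of the application is modeled as a
// class that hides element locators behind intention-revealing methods.
public class PageObjectSketch {
    // Stand-in for Selenium's WebDriver (hypothetical, to keep this runnable).
    public interface Browser {
        void type(String locator, String text);
        void click(String locator);
    }

    // The page object: tests call login(), never raw locators.
    public static class LoginPage {
        private final Browser browser;
        public LoginPage(Browser browser) { this.browser = browser; }
        public void login(String user, String pass) {
            browser.type("#username", user);
            browser.type("#password", pass);
            browser.click("#submit");
        }
    }

    // Fake browser for demonstration: records actions instead of driving a UI.
    public static class RecordingBrowser implements Browser {
        public final Map<String, String> typed = new HashMap<>();
        public String lastClicked;
        public void type(String locator, String text) { typed.put(locator, text); }
        public void click(String locator) { lastClicked = locator; }
    }

    public static void main(String[] args) {
        RecordingBrowser browser = new RecordingBrowser();
        new LoginPage(browser).login("qa_user", "secret");
        System.out.println("Clicked: " + browser.lastClicked);
    }
}
```

The payoff of the pattern is maintenance: when a locator changes, only the page object is edited, and every test that uses it keeps working unchanged.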
Spark Developer Resume Example 2
- Seasoned IT professional with 18+ years of total industry experience, including over 10 years in the Business Intelligence field. Highly motivated self-starter with excellent communication and interpersonal skills.
- Over the past 10 years, have built and led teams on Big Data analytics and numerous data warehousing, data conversion, and enterprise ODS projects.
- Experience delivering Big Data/Hadoop projects with technologies such as HDFS, Hive, Spark, Sqoop, and Scala.
- Good understanding of the Big Data ecosystem and technology stack.
- Experience delivering recent projects using Agile/Scrum methodologies.
- Hands-on, proven delivery expertise in the data modeling, ETL, and BI domains across a wide range of tools.
1.) Senior Project Manager
Company Name-Location – November 2006 to August 2018
Technologies: Hadoop, Spark 2.2, Hive, Scala, Greenplum, DB2, Teradata, Informatica PowerCenter 10, and SQL
- Oversaw multiple projects across various accounts.
- Participated in and reviewed effort estimations for various proposals.
- Provided input on delivery aspects of contracts to limit financial risk to the company.
- Scoped projects by facilitating requirement-gathering interviews, conducting workshops, and clarifying issues.
- Planned the budget and resources required for successful project execution.
- Prepared resource plans and built teams by evaluating and aligning the right resources to each project.
- Tracked and monitored project status.
- Highlighted project risks and managed them with mitigation plans in place.
2.) Senior Developer
Company Name-Location – January 2000 to October 2006
Technologies: Microsoft .NET Framework, C#, ASP.NET, MS SQL Server 2000, Oracle 10g, Cognos 8 BI, GIS, LDAP/ADSI, Web Services, and Rational XDE
- Gathered and analyzed requirements on-site and participated in client discussions during the requirements phase.
- Reviewed requirements documents.
- Participated in the design of the contact center application.
- Analyzed requirements for, designed, and developed the reporting module.
- Participated in design reviews.
| Course (Stream)/Examination | Institution | Month/Year of Passing |
|---|---|---|
| Master of Computer Applications | Anna College of Engineering | |
| Bachelor of Science | Arignar Anna Arts College – Chennai, Tamil Nadu | |
- Informatica (10+ years)
- SQL (9 years), DB2 (6 years)
- Oracle (4 years)
- MS SQL Server (4 years), Spark, Scala
- ETL Tools: Informatica PowerCenter, Informatica PowerExchange, DataStage 7.3
- BI Tools: Business Objects, Cognos
- Databases: Oracle, IBM DB2, SQL Server
- Modeling Tools: Erwin, PowerDesigner
- Languages: SQL, PL/SQL
- Other Tools: XML, UML, HTML, JavaScript, Struts, JSTL, JDBC, and Servlets