
Your Name

Data Architect

your.email@example.com
111-222-3333
www.your-website.com

Summary

Accomplished Big Data professional offering years of subject matter expertise in designing and building large, complex analytical platforms for online, social, search and advertising, CRM, and financial applications using Hadoop, HBase, Teradata, Oracle, SQL, and related technologies.
Architected, designed, and implemented large Hadoop/HBase clusters to support analytics platforms, and delivered data engineering and business intelligence solutions supporting ERP and CRM implementations across 7 countries. Collected structured and unstructured data from sources such as Intrader (positions/trades, FX) and Aladdin (securities, counterparty), and managed the full project lifecycle from identification of exposures and flags in Intrader and Aladdin through project delivery, with a specific focus on gathering requirements from regulators and end users.
Implemented large Hadoop and HBase clusters for online search, advertising, and analytics platforms on Amazon cloud infrastructure.

Work Experience

Data Architect

Bank of America, Charlotte, NC

Oct 2012 - Current

As Database Lead, designed Hadoop/Hive scripts and validated process scripts for the BwD V2 project. Implemented Hadoop and MapReduce clusters and applications in place of Teradata aggregate jobs and other resource-intensive ETL jobs.
  • Developed Hive jobs and validated results against existing SQL table data. Developed map/reduce functions (such as sessionization, complex pattern matching, and pathing-related functions) in Java, enabling functionality that was prohibitively expensive to perform in SQL (see the sessionization sketch after this list).
  • Analyzed the Hadoop stack and various big data analytics tools (e.g. Karmasphere, AsterData, Splunk, Datameer), migration from different databases (e.g. Teradata, Oracle, PostgreSQL, MySQL) to Hadoop, Hadoop training, Hadoop administration, and NoSQL competency.
  • Worked with Java, J2EE, JSP, Servlets, JavaScript, HTML, XML, JDBC, MySQL, Struts, Hibernate, Hadoop (MapReduce, HBase, Hive), Apache Tomcat, JBoss, and Eclipse.
  • Performed proofs of concept (Teradata, Oracle, Netezza, Hadoop) for a next-generation active data warehouse processing more than 3 TB/day (270 TB/quarter).
  • Architected, designed, and implemented large Hadoop and HBase clusters for online search, advertising, and analytics platforms on Amazon cloud infrastructure.
  • Architected the core infrastructure and applications on Amazon cloud.
  • Responsible for data architecture for structured and unstructured data, data modeling, and data quality.
  • Built dashboards, scorecards, and KPIs; performed data analysis and mining.
  • Built tight collaboration with business, engineering, and internal and external partners.
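
A minimal sketch of the kind of sessionization map/reduce function described above, written against the Hadoop MapReduce API. The input layout (tab-separated userId, epoch timestamp, page URL) and the 30-minute session gap are illustrative assumptions, not details taken from the original project.

import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

/**
 * Sessionization sketch: group page-view events by user, then split them into
 * sessions whenever the gap between consecutive events exceeds 30 minutes.
 * Input lines are assumed to be tab-separated: userId, epochMillis, pageUrl.
 */
public class Sessionize {

  /** Emits (userId, epochMillis) for each well-formed event line. */
  public static class EventMapper
      extends Mapper<LongWritable, Text, Text, LongWritable> {
    @Override
    protected void map(LongWritable offset, Text line, Context context)
        throws IOException, InterruptedException {
      String[] cols = line.toString().split("\t");
      if (cols.length < 3) {
        return; // skip malformed records
      }
      context.write(new Text(cols[0]), new LongWritable(Long.parseLong(cols[1])));
    }
  }

  /** Sorts a user's event timestamps and counts sessions separated by long gaps. */
  public static class SessionReducer
      extends Reducer<Text, LongWritable, Text, IntWritable> {
    private static final long SESSION_GAP_MS = 30L * 60L * 1000L;

    @Override
    protected void reduce(Text userId, Iterable<LongWritable> timestamps,
        Context context) throws IOException, InterruptedException {
      List<Long> times = new ArrayList<>();
      for (LongWritable t : timestamps) {
        times.add(t.get());
      }
      Collections.sort(times);

      int sessions = times.isEmpty() ? 0 : 1;
      for (int i = 1; i < times.size(); i++) {
        if (times.get(i) - times.get(i - 1) > SESSION_GAP_MS) {
          sessions++; // a long gap starts a new session
        }
      }
      context.write(userId, new IntWritable(sessions));
    }
  }
}

In a full job a driver class would wire these classes into a Job with input and output paths; the point of the sketch is just the gap-based session split, which is awkward to express in plain SQL.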

Sr. Developer

PNC Bank, Pittsburgh, PA

Nov 2010 - Oct 2012

Architected, designed, planned, and implemented more than 30 online and corporate data architecture and business intelligence projects across 10 different countries in a global matrix organization using Oracle RAC, Teradata, and Hadoop.
  • Analyzed the Hadoop stack and various big data analytics tools (e.g. Karmasphere, AsterData, Splunk, Datameer) and led migration from different databases (e.g. Teradata, Oracle, PostgreSQL, MySQL) to Hadoop, along with Hadoop training, Hadoop administration, and NoSQL competency (a migration validation sketch follows this list).
  • Worked with Java, J2EE, JSP, Servlets, JavaScript, HTML, XML, JDBC, MySQL, Struts, Hibernate, Hadoop (MapReduce, HBase, Hive), Apache Tomcat, JBoss, and Eclipse.
  • Performed proofs of concept (Teradata, Oracle, Netezza, Hadoop) for a next-generation active data warehouse processing more than 3 TB/day (270 TB/quarter).
  • Architected, designed, and implemented large Hadoop and HBase clusters for online search, advertising, and analytics platforms on Amazon cloud infrastructure.
  • Architected the core infrastructure and applications on Amazon cloud.
  • Responsible for data architecture for structured and unstructured data, data modeling, and data quality.
  • Built dashboards, scorecards, and KPIs; performed data analysis and mining.
  • Built tight collaboration with business, engineering, and internal and external partners.
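
A minimal sketch of one way such a database-to-Hadoop migration can be validated: comparing row counts for a table between the source RDBMS and the corresponding Hive table over JDBC. The hostnames, credentials, and table name are placeholders, and the Oracle and Hive JDBC drivers are assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

/**
 * Minimal migration smoke test: compare row counts for one table between a
 * source Oracle schema and the Hive table it was loaded into. Hostnames,
 * ports, credentials, and the table name below are placeholders.
 */
public class RowCountCheck {

  private static long count(String url, String user, String pass, String table)
      throws Exception {
    try (Connection conn = DriverManager.getConnection(url, user, pass);
         Statement stmt = conn.createStatement();
         ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
      rs.next();
      return rs.getLong(1);
    }
  }

  public static void main(String[] args) throws Exception {
    String table = "SALES_FACT"; // illustrative table name

    // Source count from Oracle (Oracle JDBC driver on the classpath).
    long src = count("jdbc:oracle:thin:@dbhost:1521:ORCL", "etl_user", "***", table);

    // Target count from Hive via HiveServer2 (Hive JDBC driver on the classpath).
    long dst = count("jdbc:hive2://hiveserver:10000/default", "etl_user", "***", table);

    System.out.printf("source=%d, hive=%d, match=%b%n", src, dst, src == dst);
  }
}

A real validation would go beyond counts (checksums, column-level aggregates, sampled diffs), but a count comparison like this is a common first check after a Sqoop or ETL load.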

Data Scientist

Clear Trial, New York, NY

May 2009 - Nov 2010

Implemented data engineering and business intelligence solutions to support ERP and CRM implementations across 7 countries.
  • Implemented a large clustered data platform using Teradata and Oracle.
  • Conducted proofs of concept using technologies such as Vertica and other big data platforms.
  • EDM, data modeling, data profiling, and data integration.
  • Implemented a subscription model for product and customer master data using SOA in a challenging business environment.
  • Managed quality assurance and enablement across programs.

Sr. Developer

Google

Nov 2007 - Aug 2009

Technical architect and program manager for the complete lifecycle of enterprise data architecture and the enterprise data warehouse supporting CRM, finance, account management, and HR, using open source tools such as Django, Rails, and PHP.
  • POC and roadmap for the EDW and MDM technology footprint.
  • Strong focus on Python/Ruby-based engines to deliver high performance.
  • Facilitated architectural decisions and roadmap development for a variety of process and technology areas, including business intelligence, data integration, and master data management.
  • Responsible for providing analytics support for dashboards, KPIs, scorecards, etc.
  • Built data integration strategy during mergers and acquisitions.

Sr. Software Engineer

Various Clients

Jan 2003 - Nov 2007

Extensive experience developing algorithms using the STL and data structures.
  • Extensive experience with configuration management tools such as CVS, ClearCase, and Perforce.
  • Extensive experience with test strategy and test plan development, verification and validation plan design, and test case preparation for product testing.
  • Experience with manual and automated testing on UNIX and Windows.
  • Experience with unit testing, system testing, integration testing, and UAT of embedded applications.
  • Wrote Perl, Java, and Python scripts for various APIs.
  • Worked across various platforms and tools: C/C++, Java, Shell, Perl scripting, Visual Basic, XML, PL/SQL, Oracle 9i, Sun Solaris, Linux, multithreading, Windows XP, MS Visual Studio 2008, MFC, .NET, ClearCase, Rational Rose, Emacs, documentation tools, Mercury QC 9.2, QuickTest Pro 9.5, Bugzilla, ClearQuest, TCP/IP, Rational Purify, and Python.
  • Established an in-company global consulting practice for data architecture and data warehousing.
  • Delivered and standardized distribution of scorecards, dashboards, KPIs, and several other analytical functions.
  • Helped monetize the product and improve product lifetime value.
  • Product and customer master data management.

Software Developer

Yodlee, HP, IN

Oct 2002 - Dec 2002

Single point of contact for data modeling, data design and engineering, data quality, etc.
  • Extensive experience in C/C++, Java, Oracle PL/SQL, and UNIX (HP-UX, Solaris, Linux) programming.
  • Worked on C, XML, UNIX shell, Perl, and JavaScript programming, plus MS Visual Studio 2008 with MFC implementation in C++. Extensive experience developing C++ applications in the finance domain using Summit by Misys.
  • Deployed TCP/IP (sockets) and multithreading in C++ and Java (a minimal Java sketch follows below). Worked on object-oriented analysis and design using C++ and Java.
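
An illustrative Java sketch of the TCP/IP-plus-multithreading pattern mentioned above: a small echo server that accepts socket connections and handles each client on its own pooled thread. The port number is arbitrary and the class is not taken from any of the original systems.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

/** Tiny multithreaded TCP echo server: one pooled thread per client connection. */
public class EchoServer {
  public static void main(String[] args) throws Exception {
    ExecutorService pool = Executors.newFixedThreadPool(8);
    try (ServerSocket server = new ServerSocket(7000)) { // arbitrary port
      while (true) {
        Socket client = server.accept();   // blocks until a client connects
        pool.submit(() -> handle(client)); // each client gets its own task
      }
    }
  }

  private static void handle(Socket client) {
    try (Socket c = client;
         BufferedReader in = new BufferedReader(
             new InputStreamReader(c.getInputStream()));
         PrintWriter out = new PrintWriter(c.getOutputStream(), true)) {
      String line;
      while ((line = in.readLine()) != null) {
        out.println(line); // echo each line back to the client
      }
    } catch (Exception e) {
      System.err.println("client error: " + e.getMessage());
    }
  }
}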

Tech Expertise
Big Data, Hadoop, Cassandra, HBase, Mahout, Pig, Hive, Greenplum, Accumulo, Avro, ZooKeeper, HCatalog, Oozie, Flume, Whirr, Sqoop, MRUnit, Bigtop, Crunch, Giraph, OBIEE, ODI, MongoDB 2.0, Pentaho

Additional Information

Specific skills
  • Architected the core infrastructure and applications on Amazon cloud (AWS, S3, EMR); a small S3 sketch follows this list.
  • Responsible for data architecture for structured and unstructured data, data modeling, and data quality.
  • Worked on the Java/J2EE platform and developed Apache Hadoop clusters from scratch.
  • Created ad hoc UIs for managing access-level information.
  • Monetization strategy and improvements to the Apache Hadoop implementation.
  • Worked on Pentaho integration (along with Hive) on very large unstructured databases.
  • Built dashboards, scorecards, and KPIs; performed data analysis and mining.
  • Built tight collaboration with business, engineering, and internal and external partners.
  • Advised CXOs of large firms on disruptive next-generation technology solutions for Enterprise 2.0 - pragmatic strategy.
  • Baselined the SOA execution for a major insurance firm - phased transformation and IT modernization of the service portfolio.
  • Introduced cloud database and integration as a service to a large hotel chain - initiated hybrid use and enforced interoperability.
  • Defined an enterprise architecture roadmap to realign IT with the rating business strategy - balancing strategic vision with tactics.
  • Productized cloud and big data services and solutions - spurring and promoting incremental adoption of new technologies.
  • Reviewed functional and non-functional system requirements and constructed conceptual and detailed designs that align processes and technologies to meet challenging liquidity risk requirements.
  • Pioneered the first-ever IBM public storage cloud product - developing completely disruptive offerings from the ground up.
  • Managed a cross-functional team to evolve the systems services product line - revamping existing suites for lower TCO.
  • Strategized adaptive portfolio and product line engineering practices and operationalized effective asset-based R&D, GTM, commercialization, and convergence - advising large companies on IT rationalization and infrastructure optimization.
  • Drove customer-facing consulting and advisory on systems services consolidation - engineering multi-dimensional simplification.
  • Devised unified cloud roadmaps with phased buildouts and work tracks - a cross-disciplinary blueprint for operationalization.
  • Built a product evolution map and HERB strategy to drive disruptive offerings - Kaizen-style function consolidation.
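
A small sketch of the kind of S3 interaction involved in building on AWS, using the AWS SDK for Java v1. The bucket name, object key, and region are placeholders, and credentials are assumed to come from the SDK's default provider chain (environment variables or an instance profile).

import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3ObjectSummary;

/**
 * Uploads a small text object to S3 and lists the bucket's contents under a
 * prefix. Bucket name, key, and region are placeholders; credentials are
 * resolved by the SDK's default provider chain.
 */
public class S3Sketch {
  public static void main(String[] args) {
    AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withRegion(Regions.US_EAST_1)
        .build();

    String bucket = "example-analytics-bucket"; // placeholder bucket name
    s3.putObject(bucket, "raw/events/part-0000.txt", "sample payload");

    // List what is stored under the raw/ prefix.
    for (S3ObjectSummary obj : s3.listObjects(bucket, "raw/").getObjectSummaries()) {
      System.out.println(obj.getKey() + " (" + obj.getSize() + " bytes)");
    }
  }
}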
