
Your Name

Program Manager/Analyst

San Jose, CA

Work Experience

Program Manager/Analyst

Cisco Systems

Feb 2012 - Current

ND&M Data Warehouse in ION
The ND&M data warehouse was created to support end-to-end analysis of the supply network and to feed the core tools for modeling and analytics (SCG, Tableau, LCA, etc.). Cost of product, manufacturing locations, inventory, product, demand/bookings, logistics, and yield data are the data groupings that provide an end-to-end view of the supply chain. The data warehouse will also help automate the cleansing, aggregation, and joining of data, and will provide a historical view of Cisco's supply chain.
Program Manager
  • Create and validate business requirements and specifications.
  • Create and maintain the project plan, including detailed schedules.
  • Responsible for identifying, tracking, and managing dependencies, risks, and issues.
  • Conduct project team meetings and 1:1s as required to accurately determine project status and to plan and initiate action plans.
  • Prepare project reports for management highlighting risks, status, and achievements.
  • Responsible for updating and maintaining program and process documentation in a central location.
  • Coordinate and gain approval for all changes to project requirements or plans, receiving stakeholder or leadership approval when required.
  • Perform business-layer reporting and basic analysis to provide data and information that drive business decision making, and monitor the roll-out of new service capabilities.
Scrum Master
  • Act as Scrum Master: organize sprint planning meetings, sprint reviews, and sprint retrospectives; own, maintain, and update the user stories.
  • Responsible for removing obstacles/impediments so the development team can deliver the sprint deliverables.
  • Aid in the planning/design of team projects.
  • Establish current areas of concern with database design and issues with data structures/fields/processes.
  • Determine gaps between data delivered vs. data requested.
  • Write SQL queries to determine how best to aggregate data across entities.
  • Build out system architecture models to facilitate future integration with the tools used for analysis.
Environment: SQL Server 10.5, SQL, BOXI, In-house Applications (QMx, AVP, ISS)
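The kind of cross-entity SQL aggregation described above can be sketched as follows. This is a minimal illustration only: the tables, columns, and figures are invented (the actual ND&M warehouse schema is not given here), and SQLite stands in for SQL Server.

```python
import sqlite3

# Hypothetical tables standing in for two supply-chain entities;
# real entity and column names in the warehouse would differ.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE inventory (product_id TEXT, site TEXT, qty INTEGER);
CREATE TABLE demand    (product_id TEXT, bookings INTEGER);
INSERT INTO inventory VALUES ('P1', 'SJC', 100), ('P1', 'AMS', 50), ('P2', 'SJC', 30);
INSERT INTO demand    VALUES ('P1', 120), ('P2', 10);
""")

# Aggregate inventory per product, then join against demand to compare
# on-hand stock with bookings in a single end-to-end view.
rows = conn.execute("""
    SELECT i.product_id,
           SUM(i.qty)              AS on_hand,
           d.bookings,
           SUM(i.qty) - d.bookings AS surplus
    FROM inventory i
    JOIN demand d ON d.product_id = i.product_id
    GROUP BY i.product_id, d.bookings
    ORDER BY i.product_id
""").fetchall()
```

Aggregating before comparing against demand is what turns per-site rows into the single product-level view the analytics tools consume.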

Sr. Data warehouse Analyst

Kohl's Department Stores

May 2011 - Feb 2012

Householding Replacement Tool
Currently, Kohl's Department Stores Marketing uses the DQXI tool to load customer data into the Corporate Data Warehouse and the CIS data mart. However, the tool does not have the flexibility to properly match customer data due to defects and limited functionality within the software.
The purpose of this project is to replace the current DQXI tool with a new householding tool (Trillium) in order to streamline mailings, reduce mailing expense, and improve the match percentage for Kohl's Charge and other bankcards over the existing levels.

Business Analyst
  • Work with business SMEs on developing and defining the business rules for cleansing and matching.
  • Anticipate, uncover and determine root causes of data quality issues using Teradata SQL and Trillium.
  • Implement and configure rules in Trillium and Informatica to address data quality issues.
  • Analyze and produce data quality reports using Teradata SQL and Trillium, and work with various data sources. Present data cleansing results to the business.
  • Establish a data quality methodology documenting a repeatable set of processes for determining, investigating, and resolving data quality issues. Also establish an ongoing process for maintaining quality data and define data quality audit procedures.
  • Define and document the cleansing rules discovered from data cleansing and profiling.
  • Engage with vendor service support groups.
Project Manager
  • Consult with business requestors and the Project Management Office; Plan and schedule project deliverables and milestones.
  • Manage the project-level process to review and decide on requests that change the project's approved requirements and specifications.
  • Execute to the project plan and oversee project personnel and their activities.
  • Facilitate regular meetings with the project team to assess status, identify new risks and issues, and ensure the team progresses towards the project goals.
Environment: Teradata 13.0, Trillium 13.0, Informatica 8.6.1, Unix, Linux.
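The cleanse-then-match pattern behind householding can be sketched like this. The function, records, and standardization rules below are invented for illustration; real Trillium householding uses far richer parsing and match rules than a handful of regex substitutions.

```python
import re
from collections import defaultdict

def household_key(last_name: str, address: str) -> tuple:
    """Simplified match key: cleansed surname + standardized address.
    Illustrative only; production rules cover many more variants."""
    addr = address.upper()
    # Standardize a few common street-suffix variants (tiny subset).
    for raw, std in ((r"\bSTREET\b", "ST"), (r"\bAVENUE\b", "AVE"), (r"\bROAD\b", "RD")):
        addr = re.sub(raw, std, addr)
    addr = re.sub(r"[^A-Z0-9 ]", "", addr)      # drop punctuation
    addr = re.sub(r"\s+", " ", addr).strip()    # collapse whitespace
    return (last_name.strip().upper(), addr)

# Made-up customer records with spelling/formatting variation.
customers = [
    ("Ann",  "Smith", "12 Oak Street"),
    ("Bob",  "Smith", "12 oak st."),
    ("Cara", "Jones", "9 Elm Avenue"),
]

households = defaultdict(list)
for first, last, addr in customers:
    households[household_key(last, addr)].append(first)
# Ann and Bob Smith collapse into one household: one mailing instead of two.
```

Collapsing address variants into a canonical key before grouping is exactly what raises the match percentage and reduces duplicate mailings.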

Sr. IT Analyst/ Project Manager

Cisco Systems

Dec 2007 - Oct 2010

Marketing Operational Data Store (M-ODS)
MODS is the Cisco MDM for Party Data. The purpose of the M-ODS is to provide the best quality, enterprise-wide view of existing and prospective customer contacts at any given time across source systems. M-ODS contains consolidated and standardized information that directly pertains to the individual's identity, work site, mailing and electronic addresses and preferences for communications by Cisco. This information will be selected from business rules applied to source system data and will be made available to other Cisco internal downstream systems via published output formats.
Transition Lead
  • Owned the application and transitioned it successfully from the development team to the production support team, and streamlined globalization of the knowledge base.
  • Interfaced between the teams in India/China, key stakeholders across theatres, and the development team.
  • Established and set up SLAs for the application.
  • Defined and established the process for job recovery.
  • Stabilized the application by setting up alert systems, cutting manual monitoring time by 80%.
  • Established an ongoing RCA/RCF check procedure for process recovery and reduced recurring processing errors.
  • Designed and developed the MODS dashboard, which showcases end-to-end data flow and current SLAs.
  • Stabilized and improved the PSSOT (Customer Registry (CR) service)-MODS integration and cut down the travel time between the two applications.
Application Lead
  • Maintain an in-depth knowledge of business workflows and application architectures for supported applications.
  • Excellent understanding of MDM concepts such as Duplicate Retirement, Data Cleansing/validation, Data Enrichment, Match Criteria, Party Merges, etc.
  • Ensure high data integrity by reviewing the data loaded in the data warehouse for accuracy, diagnosing data quality issues, and proposing corrective actions and methods to monitor solution effectiveness. Data was loaded from more than 20 sources, including Teradata, flat files, and Siebel CRM, among others.
  • Prepare and present documents illustrating data findings in a format that enables stakeholders to interpret the results of data analysis.
  • Define and conduct processes to ensure definitions, quality and metadata are all aligned across subject areas.
  • Partner with business to define data quality criteria & metrics for marketing data.
  • Create Test Strategy, Plan and Scenarios for functional and integration testing. Assist Business users in creating test cases and reviewing and interpreting test results in UAT.
  • Collaborate and work with Business users across different theaters. Gather requirements; create functional documents, process flows and technical documents for developers.
  • Review and sign-off code developed by engineering.
  • Good functional understanding of Siebel CRM modules such as Account, Contact, Response and Service Request.
  • Good understanding of Address Cleansing tools and Data/field parsing and Text standardization rules.
Project Manager
  • Spearheaded numerous complex projects, including updating over three hundred million contacts with the correct "decision maker" value across multiple applications and updating contacts' privacy and language options.
  • Led the initiative to develop and implement "Country Contact Profile Analysis and Trends" - a service that provided a single platform to monitor the data, analyze and improve (by enrichment, natural growth of data) the available data in marketing IT systems.
  • Work with Infrastructure Demand Clearing and engage with enterprise teams such as Infosec, Infrastructure, AFS, etc.
  • Work with IT Managers, PMs, and IT Analysts to create roadmaps, project plans, resource estimations, and functional documents, and to communicate plans and progress to the business.
  • Assisted the Release Manager in release planning, release status/issue/risk tracking, and post-implementation issue tracking.
  • Mentor and guide teams in US, China and India.

Environment: Informatica 8.6, Oracle, Trillium, Global Address, TOAD, PL/SQL, BO XI, Teradata
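The duplicate-retirement and merge concepts mentioned above follow a common match-and-survive pattern, sketched below. The records, the email-based match criterion, and the completeness-based survivorship rule are all stand-ins chosen for illustration; the actual MODS match criteria and survivorship rules are not described here.

```python
from collections import defaultdict

# Hypothetical party records from multiple source systems.
records = [
    {"id": 1, "email": "a@x.com", "name": "A. Lee",  "phone": None,  "source": "web"},
    {"id": 2, "email": "A@X.com", "name": "Ann Lee", "phone": "555", "source": "crm"},
    {"id": 3, "email": "b@y.com", "name": "Bo Chan", "phone": None,  "source": "crm"},
]

def match_key(rec):
    # Match criterion (assumed here): normalized email address.
    return rec["email"].lower()

def completeness(rec):
    # Survivorship score (assumed here): count of populated attributes.
    return sum(v is not None for v in rec.values())

groups = defaultdict(list)
for rec in records:
    groups[match_key(rec)].append(rec)

survivors, retired = [], []
for dupes in groups.values():
    dupes.sort(key=completeness, reverse=True)
    survivors.append(dupes[0])   # richest record wins the merge
    retired.extend(dupes[1:])    # duplicates are retired, not deleted
```

Retiring rather than deleting duplicates preserves source lineage, which downstream systems need when they reconcile against the surviving party record.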

Data Mining and Metrics Analyst

Bank of America

Mar 2006 - Jan 2007

Wave Reports Database
A part of the non-governmental back-office work was transitioned to India in three different stages called "Waves". It is important to track the status of the cases sent to the Wave Teams on a weekly basis to ensure that only target work has been sent over and that the cases are resolved with the shortest possible turnaround time without any lapses.
  • Gather data from different systems like Oracle database, Flat files for all three Wave Teams.
  • Prepare business requirements documents, business rules documents, process flows, data requirements documents.
  • Thorough system analysis, including data cleansing such as correcting misspellings, dealing with missing data elements, and parsing into standard formats.
  • Purging selected fields from legacy data that are not useful for the data warehouse.
  • Apply different transformations on the data to calculate percentage of work completed, Target Cases completed and Non-Target cases received, Pending work etc.
  • Apply transformations to calculate cases pending and cases resolved among different categories and different LOB's.
  • Loading of the Data and Indexing - for query performance.
  • Quality Assurance checking to ensure all reported values are consistent with the time series of similar values that preceded them.
Basix Case Load
Basix is a Siebel CRM tool used by GCS Service and Implementation to track sales and implementation tickets. This data needs to be extracted on a daily basis and transferred to the target systems. The reports generated from the target-system data are used to calculate the number of cases opened, closed, and open-pending, and the FCR% (First Call Resolution). This data gives Senior Leadership and the Team Managers an exact picture of how the different teams (Service, Sales, and Implementation) are performing across different sites.
  • Excellent understanding of Customer Data.
  • Worked on requirements gathering with the Marketing team and to understand the whole business process.
  • Involved in meeting with CRM and Services team in the attribute mapping process of all the systems interacting with each other.
  • Developed the extraction process from CRM and Product and Services Application
  • Designed and developed complete transformation module.
  • Created daily, weekly and quarterly reports for the management team and field operation.
  • Data retrieval, multi-dimensional analysis.
  • Worked on complete development lifecycle of Extraction, Transformation and Loading of data.
  • Used relational sources and flat files to populate the OLAP system
  • Wrote SQL queries to get the data from the Source (Oracle and SQL Server) systems.
  • Data Cleansing and Data formatting.
Environment: Informatica PowerCenter, TOAD, Oracle 9i, SQL Server 7.0
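The case-load metrics described above (opened, closed, open-pending, FCR%) reduce to simple aggregations over the daily extract. The field names and sample cases below are invented; the real Basix/target-system schema is not shown here.

```python
# Hypothetical daily case extract; "calls" = calls taken to resolve.
cases = [
    {"team": "Service", "status": "closed", "calls": 1},
    {"team": "Service", "status": "closed", "calls": 3},
    {"team": "Service", "status": "open",   "calls": 1},
    {"team": "Sales",   "status": "closed", "calls": 1},
]

def team_metrics(cases, team):
    subset = [c for c in cases if c["team"] == team]
    closed = [c for c in subset if c["status"] == "closed"]
    # FCR% = share of closed cases resolved on the first call.
    fcr = sum(c["calls"] == 1 for c in closed) / len(closed) * 100
    return {
        "opened": len(subset),
        "closed": len(closed),
        "open_pending": len(subset) - len(closed),
        "fcr_pct": round(fcr, 1),
    }

service = team_metrics(cases, "Service")
```

Computed per team and per site, these figures roll up directly into the daily, weekly, and quarterly leadership reports.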

Associate Business Analyst

Synygy India Pvt. Ltd.

Feb 2005 - Apr 2005

Worked as a member of the Client Services department and was involved in the implementation of Sales Performance Management (SPM) solutions for a Global 2000 client, Eli Lilly. Synygy Quotas Management supports the entire quota-setting and management process with a centralized, structured, and objective solution to set corporate sales goals, allocate quotas, communicate them, make field adjustments, analyze performance, and tie them back to incentive compensation plans.

  • Responsible for managing the entire quota-setting and adjustment process using PL/SQL, applying the business rules for each product by territory.
  • Automate the process of calculating quotas and quota compensation.
  • Develop and maintain the design flows in Visio.
  • Create custom goal-setting and allocation methods.
  • Use Excel, Word, Access, and Synygy Compensation software to process data, create reports, implement verification procedures, and fulfill client requests for information.
  • Prepare Test Data and perform Testing on the implemented rules.

Environment: TOAD, Oracle 9i, MS Visio, Synygy Compensation software
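One common quota-allocation method, proportional allocation with a rebalanced field adjustment, can be sketched as follows. The territories and figures are made up, and this is only one of the custom goal-setting methods a platform like Synygy supports.

```python
# Allocate a corporate sales goal to territories in proportion to
# their historical sales (illustrative figures).
corporate_goal = 1_000_000
historical_sales = {"East": 300_000, "West": 500_000, "Central": 200_000}

total = sum(historical_sales.values())
quotas = {terr: corporate_goal * sales / total
          for terr, sales in historical_sales.items()}

def adjust(quotas, territory, delta):
    """Field adjustment: bump one territory and rebalance the delta
    across the others so the corporate goal still ties out."""
    others = [t for t in quotas if t != territory]
    out = dict(quotas)
    out[territory] += delta
    for t in others:
        out[t] -= delta / len(others)
    return out

adjusted = adjust(quotas, "East", 50_000)
```

The invariant that matters is the tie-out: after any field adjustment, the territory quotas still sum to the corporate goal.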

Technical Analyst

General Electric

Feb 2003 - Mar 2005

GE Financial is a leading financial security company providing Mortgage Insurance, Life Insurance, and Long Term Care Insurance. GEFA had its data (product, customer, and sales information) in legacy systems, and it was difficult to integrate product, customer, and sales data across different databases to get a holistic view of the business. To accomplish this, the company integrated relational and non-relational data across different business units into an enterprise data warehouse, where managers across the company can access a single repository for customer and sales information, perform ad-hoc analysis, and generate analytic reports.

  • Created different Transformations for loading the data into target.
  • Worked with the DBA to create logical and physical models for Staging, Transition, and Data Warehouse using Erwin.
  • Created Stored Procedures for data transformation purpose.
  • Scheduled sessions on the Informatica server using Informatica Workflow manager.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Created, scheduled and monitored the sessions using Informatica PowerCenter Workflow Manager.
  • Designed the ETL processes using Informatica to load data from DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Oracle Data Warehouse database.
  • Created sessions and workflow for designed mappings.
  • Redesigned some of the existing mappings in the system to meet new functionality.

Environment: Informatica PowerCenter 6.2.2, Cognos, TOAD, Oracle 9i/8i, DB2, SQL Server 7.0, Business Objects 5.1.4.
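The extract-transform-load flow described in these bullets can be sketched end to end. The flat-file layout, column names, and target schema below are invented for illustration; the actual pipelines ran as Informatica PowerCenter mappings, not Python.

```python
import csv
import io
import sqlite3

# Extract: a flat-file source (in-memory here for a self-contained demo).
flat_file = io.StringIO(
    "order_id,amount,region\n"
    "1,100.50,NA\n"
    "2,75.25,EMEA\n"
)

# Load target: a hypothetical fact table in the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE fact_orders
                (order_id INTEGER, amount REAL, region TEXT, amount_band TEXT)""")

for row in csv.DictReader(flat_file):
    amount = float(row["amount"])               # transform: type cast
    band = "HIGH" if amount >= 100 else "LOW"   # transform: derived column
    conn.execute("INSERT INTO fact_orders VALUES (?, ?, ?, ?)",
                 (int(row["order_id"]), amount, row["region"], band))
conn.commit()

loaded = conn.execute(
    "SELECT order_id, amount_band FROM fact_orders ORDER BY order_id").fetchall()
```

Each bullet maps onto a stage here: source extraction, row-level transformations, and the load into the target warehouse table.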

Jr Programmer

Andhra Pradesh Technology Services (APTS)

May 2002 - Jan 2003

Andhra Pradesh Technology Services Limited is a wholly owned government corporation focusing on eGovernance. It provides consultancy, procurement services and implementation support to the government entities for their initiatives.
The aims of the project were to computerize all Revenue offices and integrate them with the public health system. This would enable the Government to know the financial status of each family along with the healthcare obtained by them.

  • Involved in the creation of schema objects like indexes, views, stored procedures, and synonyms.
  • Participated in designing the database schema for the metadata storing the informative queries that are generated dynamically.
  • Involved in writing triggers that internally call procedures and functions.
  • Involved in testing the database for the queries that are generated, and handled the performance issues effectively.
  • Involved in documenting the entire technical process.

Environment: Oracle 8i, PL/SQL, Java, Windows NT.

Data Analyst

Astirit, San Jose, CA



JNTU, San Jose, CA


Data Analysis

Additional Information

Technical Skills:
ETL Tools: Informatica PowerCenter/PowerMart 6.2/7.1
RDBMS: Oracle, MS SQL Server, Teradata, UDB DB2, MS Access
Data Modeling: Star-Schema Modeling, Snowflake Schema Modeling, Fact/Dimension/Summary Tables, ERwin 4.0
BI: Business Objects 5i/6i/XI, Cognos 6.0, Tableau
Languages: SQL, PL/SQL, C, C++, UNIX Shell Scripting, XML
OS: Windows 95/98/2000/NT, Unix
CRM: Siebel, Amdocs Clarify
Others: TOAD, Trillium, MS Office, D&B, Global Address, Visio, BMC Remedy, TIBCO