Wednesday, June 17, 2015

Coursera and LinkedIn

I've taken many MOOCs on Coursera.org. To be honest, given time constraints and other responsibilities, I've probably completed only one for every five that I've actually started. Without question there are a lot of very interesting subjects available to everyone through Coursera, and they are all free (with exceptions and caveats explained below).

At present, and as I understand it, Coursera offers two tracks when you take a class: a "Statement of Accomplishment" and a "Verified Certificate." A Statement of Accomplishment is free to students (kudos to Coursera), but Verified Certificates cost a lot of money. In the past, Coursera provided a mechanism for posting an earned Statement of Accomplishment to your LinkedIn account. I've since learned from Jason Cruz at Coursera that this will no longer be an option; only people who pay for the very expensive Verified Certificate will be able to post to LinkedIn.

I've consistently posted my Statements of Accomplishment to LinkedIn to show my children, employees, and past students that there is value in being a "lifelong learner." At my age and position I value the learning opportunities that Coursera offers, but I don't consider them career advancing, so I'm uncomfortable paying the extremely large fee for a Verified Certificate.

Most importantly, though, I'm concerned that those who are unemployed, underemployed, or struggling to find new opportunities will be hurt by this poor decision from Coursera.

I'd welcome your comments.

Friday, June 7, 2013

Experiences and Observations from My First MOOC

My organization, like many in higher education, is trying to determine how Massive Open Online Courses (MOOCs) can work within our existing frameworks for serving students. I volunteered to take a MOOC from start to finish and share the experience with our chancellor, board members, and anyone else in our system with an interest. Since mine is the experience of but a single class, I decided to use this blog in hopes that others may share their MOOC experiences, praises, and concerns in the comments section. So please share!

The Class: I chose to take a class through Coursera (www.coursera.org) on "Gamification" by Kevin Werbach, an Associate Professor of Legal Studies and Business Ethics at The Wharton School, University of Pennsylvania. I very much enjoyed the class and got a lot out of it, and it proved serendipitous that I chose this class because, as it progressed, I saw many elements of gamification in MOOCs themselves. For those unfamiliar with MOOCs, "massive" means many thousands can sign up for the class; in this MOOC there were over 66K registrants. For additional background on gamification the course description can be found at:  www.coursera.org/course/gamification

The Format: I found the course design to be very straightforward, easy to understand, convenient, and similar to most online courses.  There were video lectures released weekly, supplemented by discussion forums and video office hours (Google Hangouts which, due to the size of the class, were limited to invitees, though anyone could listen: www.youtube.com/watch?v=Y6_ONfLJ2mc).  Coursera also makes provisions for what it calls "meetups," where students in the same geographical area can arrange face-to-face meetings with fellow students. Students were evaluated based upon their performance on 4 quizzes, 3 written assignments, and a final examination. The quizzes and the final examination consisted of multiple choice questions where a correct response could have multiple answers. The quizzes and the final exam were graded immediately by the Coursera system, giving instant and detailed feedback on responses. With a penalty, you could retake each quiz or the final up to 5 times. If you did repeat a quiz or the exam, you would be presented with new questions and/or previous questions phrased differently, with the response orders changed. Two components of the class stuck out as different from typical online courses:

  • Interaction with the Professor – With such a large number of students it just isn't practical for everyone participating to reach out and ask the professor questions.  That said, the professor, assisted by a team of teaching assistants (TAs), did monitor the discussion forums, and as discussion threads were posted they did interject where there was value in doing so.
  • Peer-graded written assignments – Since it's "massive," it's not practical for a single professor, even assisted by TAs, to grade all written assignments.  So students are provided a grading rubric consisting of quantitative and qualitative measures for evaluating other students' responses (each student is required to evaluate 5 peers per assignment).  In my experience this is where the system is both strong and weak, depending on the investment each student has in the course. Participants who are motivated give great and detailed responses; others, not so much. In the first grading round I was convinced the system was giving me poor responses just to test whether I would grade them accordingly.  In the second there were great responses that I found inspiring. In the last assignment … there were both great and mediocre responses.

Statistics of the Class: Professor Werbach created an excellent wrap-up video of the statistics on the class that went into demographics, success rates, etc. that can be viewed at: class.coursera.org/gamification-002/lecture/170. I’d recommend watching the video, but if you're not so inclined the highlights include:

  • 66K+ registrations
  • 145 countries (U.S. 26%)
  • Average age 33 (ranging from pre-high school to 70’s & 80’s)
  • 2/3rds were male
  • 77% working or otherwise not currently in school
  • 80% have a 4-year college degree
  • 44% have an advanced degree
  • 5.6K (8.4%) successfully completed the class

The low completion rate probably sticks out to many and raises some eyebrows. To put it into perspective though, as an adjunct faculty member I taught 2-3 classes a semester for over 12 years. Considering there were anywhere from 30-45 students in each of my classes, this MOOC's 5.6K completions are more than double the number of students I taught in all of those 12 years.

Monetization/Motivation: The business model for MOOCs is obviously evolving. The system I work for recently announced (along with many others) a pilot program with Coursera (see bit.ly/16OjRZl) to test new business models and teaching methods. Nevertheless, there is some revenue in the model today. At the time of my MOOC, some classes, including mine, offered the option of paying a fee for the "signature track." With the signature track, each assignment came with a couple of what I'll call authentication tasks, which basically consisted of a biometric profile of the student's unique typing pattern and a photo. Successful students (completing the course with 70% or higher) who signed up for the signature track received a "verified certificate"; otherwise the award was a "statement of accomplishment." For more on the signature track see: www.coursera.org/signature/guidebook. From a revenue-estimating perspective, the course had 2,189 participants sign up for the signature track (≈3%). For this course the regular price of $69 was discounted to $39, which means the signature track brought Coursera a little over $85K in revenue. How or even whether any of this was shared with the professor I have no idea, but I would imagine there must be some revenue sharing (at least I would hope so). In any event, the professor was able to promote his book on the subject, and I'm happy to give it a plug here as well: wdp.wharton.upenn.edu/retailers/for-the-win/
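As a back-of-the-envelope check on that revenue figure, here is a minimal sketch assuming every signature track participant paid the discounted $39 price (some may have paid the regular $69, which would only push the number higher):

```python
# Rough signature track revenue estimate for this MOOC.
# Assumption: all 2,189 signature track participants paid the discounted $39 price.
signature_track_signups = 2189
discounted_price_usd = 39

estimated_revenue = signature_track_signups * discounted_price_usd
print(f"Estimated signature track revenue: ${estimated_revenue:,}")
# Estimated signature track revenue: $85,371
```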

Social Media and Networking: I was pleasantly surprised at how social media came into play in this MOOC. Professor Werbach details the discussion board posts, peer assessments, and Twitter interactivity in his closing video. Not mentioned, however, was the LinkedIn group that quickly formed and has now grown to over 455 members (many of whom have connected with one another). Posts to this group are still occurring, and professionals in the subject area who were not part of the class have joined the group and are sharing their expertise and insights as well.

Is there a MOOC in your future/past? Lifelong learners will find MOOCs to be yet another valuable avenue by which to improve their knowledge. The format is exceptionally convenient: there is no commitment upfront, and if the MOOC isn't what was expected the student can discontinue at any time. As I stated at the beginning of this blog, I'm very interested in the experiences of others with MOOCs. Please share your experiences (good and bad), motivations (for taking a MOOC or signing up for the signature track), and any future plans you may have (personal or professional development for you or your staff).

Thursday, April 4, 2013

Eight Consolidating, Collaborating, and Shared Services Projects as Discussed at Ellucian Live


It was an honor to be asked to participate in the 2013 Ellucian Live Executive Forum concurrent session on "The ROI of Consolidating ERPs and Services Across Multiple Campuses." Because of presentation time constraints, provided below are more detailed descriptions and particulars for the 8 Tennessee Board of Regents (TBR) projects highlighted in the discussion. You can find the presentation slides at: http://slidesha.re/16uiKbk

ERP implementation (ERP): To better serve students, faculty, staff, and other constituents within the state, the TBR replaced its existing 3G administrative software with an enterprise resource planning (ERP) system in order to bring together processes, people, and information. The new system, powered by Ellucian's Banner® administrative system, offers a host of "self-service" online capabilities that had not been readily available before. This project stood apart from similar projects because of the TBR's unique deployment approach, which combined a Shared Rapid Implementation Methodology; a focus on the commonalities among its 19 institutions instead of their differences; a third-party project oversight committee; and a creative "red/yellow/green" project reporting system. The result was a fast-tracked project that was completed on time and effectively controlled costs, with one-time savings of $11-15 million. http://bit.ly/13Plapk

System Maintenance Office (SMO): The SMO was the logical outgrowth of the ERP implementation and was the TBR's first major shared services project. Managed by and at the direction of the TBR, the SMO is a unique partnership with Ellucian that leverages both Ellucian and TBR employees to provide important services to all TBR institutions. These services include:
·         Providing system-wide tier 1 & 2 "Action Line" support for all institutions
·         Developing and maintaining customizations/modifications
·         Supporting installation of software releases
·         Responding to institution-specific issues, such as software debugging/troubleshooting/testing
·         Supporting ongoing user functional and technical training
The SMO's primary objective is to eliminate the duplication of software maintenance, support, and training efforts throughout the TBR's 19 institutions, and it results in $4.7-5.2M in annual savings over the alternative of each institution providing the support individually. http://bit.ly/14RgXAW

DBA Collaborative (DBAC): In a similar fashion to the SMO, the DBAC is another shared services group that provides Oracle DBA support to the campuses. It differs from the SMO in that all of the core staff members are TBR employees who collaborate with campus DBAs in support of all institutions. Using this model, campuses do not have to staff as many DBAs or database technicians as they otherwise would, which results in savings of $1.8-2.8M annually. http://bit.ly/ZyuJQT

Business Intelligence (BI): Leveraging a consulting engagement at a TBR institution that identified 200+ key performance indicators (KPIs), eight TBR institutions began collaborating on a KPI toolbox. The toolbox is for executives and management across TBR campuses to measure and evaluate the effectiveness of their particular area(s) of responsibility. This shared/collaborative approach will save $4-11M over individual/independent campus initiatives. http://bit.ly/MdWWtl

Banner Hosting (Hosting): Six institutions in the TBR collectively investigated hosting their Banner systems rather than operating them on their campuses. This resulted in a unique cloud collaboration with a third-party provider that leveraged virtualization and clustering technologies to provide real cost savings along with the benefits of Tier 3-4 hosting facilities. The original six institutions have now grown to nine and will share in cost savings of $2.3M annually over running the systems locally. http://slidesha.re/UGH2IN

E-Commerce (E-Comm): Lack of contract and supplier visibility, a want of procurement automation, the absence of data on cumulative purchases, and the repetition of the labor-intensive processes of vendor registration and maintenance are just some of the many procurement challenges across a system as large as the TBR.  A combination of SciQuest products implemented system-wide addresses these procurement challenges and also produced a 5-year cost benefit ranging from $1.27 to $2.63 for each dollar invested. http://slidesha.re/QxFHIL

Common Data Repository (CDR): The National Center for Higher Education Management Systems (NCHEMS) recommended that TBR create a data warehouse to enhance decision-making at both the system and campus levels. The strategy, now referred to as the "Common Data Repository" (CDR), is to create a single authoritative data warehouse into which data from TBR institutions are automatically fed from their Banner administrative systems, whether hosted or located at the campus. This single data warehouse strategy utilizes Oracle GoldenGate to transfer data to an Ellucian multi-entity processing (MEP) enabled operational data store (ODS)/enterprise data warehouse (EDW) to build the CDR, and it will result in annual savings of $3.4M+ over local ODS/EDW implementations.

Tennessee Summit (TN-Summit): The Tennessee Summit on Administrative Computing Technologies is an annual event that is open to higher education professionals in Tennessee and surrounding states who have adopted Banner from Ellucian. The TN-Summit was established to provide a forum for active examination of how administrative technology supports the institution, its students, faculty, and staff, and how this support can be improved. It offers eleven thematic tracks: Accounts Receivable, Advancement, Business Intelligence, Finance, Financial Aid, Human Resources, Luminis, Student, Leadership and Management, Technical (DBA, Hardware, OS), and Technical (Application Programming). These tracks give TBR staff a conference experience while saving $572K+ annually in travel costs and the higher registration fees of other conferences. http://tnsummit.tbr.edu/

Monday, May 16, 2011

Background about the Tennessee Board of Regents “Business Process Management” Project

This week (May 17-19), some 250 people from the 13 community colleges in the Tennessee Board of Regents (TBR) system will meet in Nashville (in person and by WebEx/phone) for workshops to kick off an unprecedented business process management (BPM) program. The effort was brought about by the Complete College Tennessee Act of 2010 (CCTA), TCA 49-8-101(c), through which the TBR was tasked with merging the community colleges into a "comprehensive statewide system" of coordinated programs and services. With respect to the legislation, the BPM project has two major goals: (1) bring significant standardization to how the institutions conduct business and interact with students, providing a common and pleasant experience no matter which institution a student chooses to attend, and (2) realize savings and efficiencies for the taxpayers through consolidation of services and overhead.

Holistically, the BPM project will provide the TBR with a comprehensive and systematic methodology for improving and standardizing the way the TBR community colleges conduct the business of education. BPM is a process-centric approach to improving performance that combines information technologies with process and governance methodologies. From a strategic standpoint the process is also intended to deliver:

§ Business “Value” – Creating value for both students and stakeholders. In addition to improved bottom-line performance and enhanced student loyalty, satisfaction, and graduation rates, BPM will indirectly facilitate the goals and objectives of the CCTA with increased innovation, improved productivity, and elevated levels of staff effectiveness and satisfaction.

§ Process "Transformation" – The re-alignment of operational processes will make them more effective, more transparent, and more agile. This will enable problems to be solved before they become major issues, and it will reduce errors as well as detect them sooner so they can be remedied faster.

§ Management "Enabling" – Bringing together all of the systems, methods, tools, and techniques of process development and management into a well-architected system will provide management, staff, and faculty with the visibility and controls necessary for steering and fine-tuning.

The Kickoff Workshop(s)

In the kickoff workshops on the 17th – 19th, all participants will first attend an "Overview of Business Process Modeling" session that will introduce the goals and objectives, terminology, methodology, and technology, and explain how the project will be conducted. It will lay out the work ahead and provide expectations, the schedule, and other information for functional users who are new to BPM.

In the subsequent workshops everyone will be divided into the six functional areas they represent: Finance, Accounts Receivable, Human Resources & Payroll, Registration & Records/Academics, Financial Aid, and Recruitment, Admissions & Enrollment Management. In these functional workshops the groups will identify, validate, and prioritize the key enterprise processes they are responsible for, along with discussing their critical issues, impacts and impediments, ROI, and strategic and/or operational significance. Discussion will center on three key components of the business processes in each area:

· Critical Issues – what are the issues of concern within each process area related to process execution, coordination of hand-offs between processes, roles responsible for process execution, and the technology used to support the process?

· Process Impacts – why are the processes important to the various process stakeholders (students, faculty, staff, management, leadership, etc.)?

· Process Value – what value can/will be realized through the improvement of the process?

Functional Process Modeling Workshops

After the kickoff workshops have concluded, the functional groups will reconvene about every other week for workshops in which the business processes identified earlier will be collaboratively redesigned by teams consisting of members from all of the community colleges. These sessions will be consultant-facilitated, with the goal of modeling what the processes will look like in the future/ideal state.

Infinity Process Platform: the BPM Technology Catalyst Tool

This project will utilize a new BPM tool from SunGard called Infinity Process Platform (IPP). The software provides out-of-the-box functionality for process diagramming, modeling, execution, analysis, and integration to simplify the creation and management of business process models. IPP enables the consistent management of business processes with automation, integration, monitoring, and reporting. A product overview and data sheet can be found at: http://bit.ly/ickYl8

Advice, Resources, Ideas, Suggestions?

Such an extraordinary project isn't without substantial risk. Any advice, resources, ideas, suggestions, etc. from projects of similar size and scope that could help mitigate risk and ensure the success of the project would be greatly appreciated.

Tuesday, April 26, 2011

Background (Q&A) about the Tennessee Board of Regents “Open Business Intelligence Initiative”

Last month, at SunGard Higher Education's annual conference ("Summit") in New Orleans, a presentation was given on the Tennessee Board of Regents Open Business Intelligence (BI) Initiative: http://bit.ly/hgs0YF. I've received requests for additional background information on the project, which I've shared below. Your comments, ideas, and suggestions are always appreciated.

Description of the “Open BI Initiative.” How did it start?

This project was the result of an institutional effectiveness improvement initiative that began at Tennessee State University (TSU) and was subsequently expanded to encompass requirements for all Tennessee Board of Regents (TBR) institutions brought about by the Complete College Tennessee Act of 2010 (CCTA), TCA 49-8-101(c). The TBR system consists of six universities (including TSU), 13 community colleges, and 27 technology centers. TBR's combined annual enrollment of over 200,000 students and budget of $2.4 billion make it the nation's sixth largest system of public higher education.

Early in 2008 TSU engaged a third-party consultant to identify the key performance indicators (KPIs) to track in order to improve institutional effectiveness. The process consisted of campus leadership interviews, a review of institutional documents and plans (approximately 12 sources, including strategic and master plans), and research of external sources (SACS, NACUBO, IPEDS, THEC, NCAA, USN&WR, and other university dashboards). In the consultant's final report to TSU (summer 2008), more than 200 KPIs were identified spanning 19 functional areas (approximately 180+ of the KPIs can be reported out of Banner). A complete list of the KPIs and the functional areas that are the focus of improvement can be found in .PDF format at: http://slidesha.re/9quuBL

In the spring of 2009, TSU and TBR entered into a partnership to develop the KPIs in the consultant's report by dedicating one-fifth of the time of five individuals to work with functional users to code and test the KPIs. In January of 2010, the project took on a new degree of urgency with the passage of the Complete College Tennessee Act of 2010. With CCTA, the funding formula for higher education, previously based on enrollment headcounts, changed to one that emphasizes student success and outcomes, including higher rates of degree completion. In addition, a zero-sum funding environment with no funding increases results in a reallocation of dollars from institutions with lower outcomes to institutions with higher student outcomes.

Based upon these developments, TBR and TSU "opened" the project up to any institution in or outside of the TBR system that wishes to participate. The only prerequisite is that the institution be a SunGard Higher Education Banner client that also uses ODS (Luminis preferred but not required), and the commitment is to develop 5 KPIs every 90 days. As of this date there are six institutions using and developing KPIs in and for the repository, with more expressing interest, making this a project that benefits multiple institutions.

What are the goals of your project?

The goals of this project have evolved over the 3+ years it has been in existence. As of this date, the six (6) major goals of the project are:

· Develop 180+ key performance indicators (KPIs), alerts, and dashboards to provide actionable intelligence on the efficiency and effectiveness of operations across 19 functional areas of focus for participating colleges and universities. These metrics are to be made available to all institutions that participate in the project, helping them improve institutional effectiveness and, with respect to TBR institutions, better manage toward the requirements of the CCTA.

· Provide a secure and authoritative data management environment with complete and consistent data and metric algorithms that can be duplicated across the TBR enterprise or other institutions that use ODS/EDW.

· Portal/Web delivery of KPIs, alerts, dashboards, etc. to enhance institutional agility and decision making by ensuring that data is both accessible and transformed into actionable information in a timely manner.

· Enable participating campuses to push decision making down to the lowest logical level to avoid bottlenecks and allow acting on information in a timely manner.

· Attract/recruit additional institutions both in and outside of the TBR to participate in the project. BI projects are never completed and are always a work in progress. Additional talent and new perspectives can only increase the quality and quantity of valuable metrics that can be utilized by all institutions who participate to improve the effectiveness of their institution. A long term goal would be to explore with SunGard Higher Education the possibility of inclusion of the project into the SGHE Community Commons.

· Facilitate the development of “balanced scorecards” to be used by functional managers to keep track of the performance of activities by staff members within their responsibility and monitor any consequences that may arise from their actions.

What technologies are used for the project?

The project is based upon three technologies and development tools. A brief description of each, and how it contributes to the project, follows:

· Operational Data Store (ODS) – ODS is a companion product to Banner that is intended to provide a consistent view of institutional data across the enterprise for reporting purposes. Through an extract/transform/load (ETL) process, production data is extracted from Banner, transformed into "denormalized" tables, and then loaded into the ODS, where it is presented to functional users using familiar business terms and definitions. This makes it easier for users to access the information they need for reporting, ad hoc queries, etc. without impacting the performance of the Banner production database. Since most universities and all community colleges in the TBR system use ODS, it was the logical choice upon which to build a major BI initiative: mapping data to KPIs is simplified by the ETL process, and, given the complexity of many of the KPI algorithms, the performance of production Banner will not be impacted.

· Luminis – Luminis is a portal and web services environment that has been tailored to work with Banner and ODS. Again, Luminis is used by all TBR institutions making it the logical choice as the delivery mechanism for the KPIs, alerts, dashboards, etc. to executives and management alike. Its standards-based user authentication provides the level of security that was sought and delivers a consistent user experience that can be easily adapted by all participants in the project. Luminis also has the capability to build and manage communities, which is ideal for making it possible to “departmentalize” certain KPIs and build views targeted to certain constituents, such as presidents, provosts, vice presidents, etc. Lastly, the ability for users to contribute to wikis, have threaded discussions, and share resources and information was also seen as a plus for developing and documenting KPIs.

· Argos – Argos is robust enterprise reporting software from the Evisions Corporation that is well suited to higher education reporting needs. Argos is easy to implement and easy to use for a wide variety of reporting needs -- from ad hoc reporting to advanced OLAP data cubes, as well as sophisticated dashboards and formatted reports. The Argos reporting tool contains an application program interface (API) publishing feature that allows reporting solutions to be created and quickly shared with executive management across the campus. This feature also allows end users to become self-sufficient for many of their own reporting needs, which can reduce the burden on information technology departments for attending to the ad hoc reporting needs of their campus clients. Additionally, Argos has a lower cost of ownership than many business intelligence tools currently on the market while delivering an enterprise-wide reporting solution across the campus.

It is important to note that institutions contemplating joining the initiative do not have to be Luminis or Argos users. Argos can generate SQL code that can be imported into other reporting tools, so any reporting software that can import SQL code can be used. Other portal solutions can be used as well. The strength of this project is that it identifies the data elements in Banner and ODS/EDW as well as the algorithm to calculate the KPIs.
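To make this concrete, here is a minimal sketch of what one KPI calculation against a denormalized reporting view might look like. The view name and column names are illustrative assumptions only (they are not actual Banner/ODS object names); the real data elements and algorithms are the ones documented in the project's KPI listing linked elsewhere in this post.

```python
# Minimal sketch of a KPI calculation against a denormalized reporting view.
# NOTE: the view name (student_enrollment_view) and its columns are hypothetical,
# not actual Banner/ODS objects; adapt to the data elements documented for the KPI.

def fall_to_fall_retention_rate(conn, cohort_term: str) -> float:
    """Share of a first-time freshman cohort still enrolled the following fall."""
    sql = """
        SELECT COUNT(*) AS cohort_size,
               SUM(CASE WHEN retained_next_fall = 'Y' THEN 1 ELSE 0 END) AS retained
          FROM student_enrollment_view          -- hypothetical denormalized view
         WHERE cohort_term = :cohort_term
           AND student_type = 'FIRST_TIME_FRESHMAN'
    """
    cur = conn.cursor()   # any Python DB-API connection, e.g. an Oracle driver
    try:
        cur.execute(sql, {"cohort_term": cohort_term})
        cohort_size, retained = cur.fetchone()
    finally:
        cur.close()
    return retained / cohort_size if cohort_size else 0.0
```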

What is innovative about the project?

TBR is utilizing a shared collaborative approach (unique for a BI initiative in higher education), where multiple colleges and universities (six as of this date) are involved in the development of the project. This approach is delivering the following overall benefits:

· Hard Cost Savings – Had this initiative gone out to a competitive bidding process, it is estimated that a consultant-led KPI development effort would have cost between $457K and $507K.

· Distribution of Soft Costs – The development of KPIs requires a significant investment on the part of an institution in soft costs in the form of technical and functional staff members both coding and testing the KPIs. Distributing the workload among multiple institutions ensures that no single institution is overburdened by the level of effort.

· Speed of Development – With multiple institutions working on the project, KPIs can be developed in parallel instead of serially, reducing the time to completion of the project.

· KPI Quality – Involving multiple institutions in the effort enables management to identify and recruit the most talented staff members available to build out selected KPIs. For example, an institution with exceptional staff in finance would be targeted for finance KPIs, whereas an institution with gifted staff in student affairs would likewise be asked to develop student KPIs. Additionally, with multiple institutions there is a larger pool of technical and functional experts to poll for assistance, ideas, and suggestions when issues arise, as well as a larger number of testers to assure better quality control.

A status report of the project that was originally presented to all TBR presidents and the Chancellor’s senior staff in August of last year can be found at: http://slidesha.re/gdijhx

How has the project impacted the organization?

The old management adage "You can't manage what you don't measure" is the fundamental crux of this project. Its intent is to deliver to the executive and management levels of functional areas across multiple institutions in the enterprise the tools necessary to measure and evaluate the effectiveness of their particular area(s) of responsibility. Unlike most software projects, which only address particular needs or requirements related to certain business processes, this project goes to the core of institutional leadership and management: it is all-encompassing and enables the participating institutions to address issues related to their functional investment portfolios, functional strategies, and even employee performance. Perhaps the ultimate impact has been the message it sends: that measuring performance and outcomes, along with accountability for both, has become an executive priority for the institutions.

What are some of the quantifiable measurements from the project?

It is important to note that at the outset this project was not intended to produce any particular outcome other than to provide the executives and management of participating campuses a toolkit of KPIs they could use to measure institutional effectiveness. Since the work product of this project is accessible to multiple colleges and universities, how the management on those campuses chooses to utilize the KPI toolkit can and will differ dramatically from campus to campus, depending on their particular needs and mission.

The project has gone from zero (0) KPIs for monitoring institutional effectiveness across 19 functional areas to an ever-increasing inventory of KPIs (target 180+), all of which can be derived from Banner and the ODS. Interest in participating in the development of the KPIs has increased by 300%, and there is interest from additional institutions, both within and external to the TBR, that will drive this number even higher. This would seem to be an indicator that the desire to better manage campus activities is becoming a priority with campus leadership, most likely due to tight budgets and legislative mandates.

A listing of the KPIs, complete with their category, descriptions, how calculated, units of measure, source, dimensions, and frequency can be found in .PDF format at: http://slidesha.re/9quuBL

Some examples of the KPIs developed can be found at: http://slidesha.re/hTLjE9

The number of KPIs by functional area:

· Access to Education - Distance Education: 6
· Access to Education - Financial: 12
· Admissions: 15
· Enrollment: 16
· Student Affairs: 16
· Graduation: 10
· Retention: 3
· Quality of Education: 9
· Program Management: 3
· Faculty and Staff: 28
· HR: 2
· Business & Finance: 23
· Development: 18
· Research & Sponsored Programs: 13
· President: 6
· CIT (Information Technology): 10
· Facilities: 4
· Library: 6
· Athletics: 11

Three examples of where KPIs are actively being used:

Graduation by Year and College within the University – For the period 2008 to 2009, university-wide graduations fell by 135 FTE, or 7.96%. In the following period, 2009 to 2010, graduations increased by 42 FTE, or 2.69%. The metrics enable college deans to track their graduation rates as compared to other colleges in the university and address issues.

New Freshman Enrollment by College – For the period 2009-10, increases in new freshman enrollment in the College of Arts and Sciences (10.96%), College of Engineering (7.95%), and the School of Nursing (2.9%) indicated that efforts to recruit into engineering and nursing were succeeding without cannibalizing arts and sciences.

Revenue from Tuition by Term – Undergraduate out-of-state revenue, which the institution relies upon heavily, was shown to be declining. Tracking revenue gains by all student types, with undergraduate out-of-state revenue growing by 2.52%, in-state revenue increasing by 6.34%, and similar gains in graduate education, demonstrated that efforts to reverse the decline were becoming successful.

Who can participate, what are the costs, and who should be contacted for additional information?

Any organization, consultancy, or vendor affiliated with SunGard Higher Education’s Banner ERP suite and ODS/EDW is welcome to join the effort. There are no costs to participate, and the only commitment is to help with the development of the KPIs that have been identified (and/or contribute additional KPIs that would be beneficial to the project).

For additional information, ideas, suggestions, and/or to join the initiative please feel free to leave a comment or send an e-mail.


Tuesday, March 8, 2011

Gauging Interest in a Multi-institutional Business Intelligence Initiative for Higher Education

Later this month SunGard Higher Education (SGHE) will hold its annual conference, or "Summit," in New Orleans, LA (March 20-23). I'm very much looking forward to this Summit for a couple of reasons. First, we (that is, the Tennessee Board of Regents, or TBR) will be presenting our Open Business Intelligence (BI) Initiative. We're calling it "open" because we don't anticipate it will ever be completed and we want anyone in our system who shares our interest and passion in measuring institutional effectiveness to join us in this initiative. My second and more important motivation for attending, though, is to float a trial balloon and explore whether institutions outside of the TBR are interested in joining the initiative as well.

The old business school adage “You can’t manage what you don’t measure” is the fundamental driver for this project. Its intent is to continually deliver to the executive and management levels of all functional areas across multiple institutions in the TBR enterprise the tools necessary to measure and evaluate the effectiveness of their particular area(s) of responsibility. It was derived out of a consulting engagement that identified more than 200 key performance indicators (KPIs) that crossed over 19 discrete functional areas, and is being built out of SunGard Higher Education’s Banner® ERP system and their operational data store (ODS).

We're using a shared collaborative approach in which colleagues from multiple colleges and universities (six as of this date) are involved in developing the KPIs that were identified in the $200K consulting engagement for the project. Since estimates for a consultant-led effort to build out the KPIs have ranged from $400K to $500K, those who join the initiative will eventually benefit from a project that, combining the original engagement and the build-out, would conservatively cost any single institution somewhere in the neighborhood of $600K - $700K.

So … the purpose of this blog is to try to gauge interest in and perhaps attract/recruit additional institutions both in and outside of the TBR to participate in this Open BI project. Additional talent and new perspectives can only increase the quality and quantity of valuable metrics that can be utilized by all institutions who participate to improve the effectiveness of their institution. Initially, the thought is that participation would only require a commitment by an institution to building out at least 5 KPIs. Some briefing/background materials on the project can be found at: http://bit.ly/i9UUUc

If you will be attending the Summit, please meet with our project lead on March 23rd at 11:00 AM, when she will present "Tennessee Board of Regents Business Intelligence Initiative" (session 5010). If you cannot attend but are still interested in learning more, please contact me and we can perhaps arrange a phone call or even a webinar after the Summit. We look forward to providing you additional information and an opportunity to convince you to join us!

Tuesday, May 4, 2010

Why Should Vendors Help Cost Analysts with Studies Regarding Budgetary Proposals?

Recently I was working on a cost analysis project to determine the rough order of magnitude (ROM) estimates for numerous elements of a rather large software development project. The ultimate goal was to develop a budgetary proposal to seek funding for the project. ROM estimates are useful where the project is in the planning stages and requirements are not specified in great detail. My approach was to use a variation of the actual cost estimation methodology by contacting vendors to ask them to provide costing data based upon their experiences with similar projects.

Thirty-five organizations were identified and contacted, with 15 ultimately responding. This represented a 43% participation rate, which I was happy with. However, during the course of contacting vendors and collecting data, a question came up on more than one occasion (always from organizations I had never worked with before) that was something along the lines of: "Why should I help you with your study?" It's a question that has come up in past costing projects, and I feel I still haven't come up with a really compelling response. So I've decided to use the power of the blog to attempt to identify as many reasons as I possibly can and to seek the ideas and input of others as well. I'd greatly appreciate your comments, thoughts, and suggestions about the five reasons I've listed below, as well as your help in identifying the many I certainly have missed …

New business leads – Chances are the consulting firm doing the budgetary study contacted you and not the other way around. When they contacted you they were working for a company that represents a potential new client to you or they wouldn’t have contacted you in the first place. Additionally, and more importantly, by assisting the firm with their study, when they do similar projects in the future they are far more likely to contact you again if you have been helpful to them in the past. Thus there is the potential for new business now and additional leads in the future.

An inside scoop on a new business opportunity – The budgetary study may or may not eventually lead to a request for proposal (RFP) opportunity. Nevertheless, should an RFP be issued you will have a lot of information far in advance of your competition who did not participate in the budgetary study. This will enable you to respond more quickly, accurately, and with far more insight into the project than if you were just seeing the requirements for the first time. Your proposal has the potential of being of better quality because of your participation.

A third-party introduction to a potential new client instead of a cold call – Nearly everyone would agree that the most desirable way of gaining new clients is through an introduction by a third party that the potential new client values and respects, as opposed to contacting them independently with "cold calls." The consulting firm that is performing the budgetary study most likely has earned the trust and respect of the potential new client, as demonstrated by their receiving the business to perform the study. Your providing information to the firm will typically result in your organization being cited in their report as one that contributed information, expertise, guidance, etc. This constructively serves as an introduction to the potential client and indicates that the firm respects your contributions enough to include them in its report. An added benefit is that by being listed in the report, you most certainly will be on the list to receive an RFP should one be issued.

A better understanding of budgetary estimating and the potential new client – Participating in the process itself can be both educational and useful for obtaining future business. These types of studies differ from traditional RFPs and RFIs in that there are no procedural or legal obstacles to prevent you from asking as many "relevant" questions as you like. Bear in mind that the firm doing the study can only provide information that it is aware of or that is relevant to the study, but it may share its perspective on the project. After all, if it was important enough to hire a firm to do a cost study, it may very well be worth your time to learn more by participating. In any event it is a great opportunity to learn what the process is all about and develop internal processes for responding in the future, regardless of whether an RFP is ever issued.

The Vendor Perspective – How many times have you as a vendor, when asked to respond to an RFI/RFP, asked the potential client: "What's the budget?" The whole point of your organization being contacted to participate in this process is to seek your expertise on what that budget should be. Your opinion, coupled with the opinions of other respondents (typically anonymous with respect to their estimates), will be used to determine a realistic budget for the project. Without the cooperation of vendors it can be difficult to determine a budget that is reasonable. So if you don't contribute … please understand you may be part of the problem if you're later faced with a potential client with high expectations but not enough funding to bring them to fruition.
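For illustration only, here is a minimal sketch (not the methodology used in the actual study) of how anonymized vendor estimates might be rolled up into a rough order of magnitude budget range:

```python
# Minimal sketch: rolling anonymized vendor estimates up into a ROM budget range.
# The figures below are hypothetical; they are not data from the actual study.
from statistics import median

def rom_range(estimates):
    """Return (low, typical, high) from a list of vendor cost estimates in dollars."""
    if not estimates:
        raise ValueError("at least one vendor estimate is required")
    ordered = sorted(estimates)
    return ordered[0], median(ordered), ordered[-1]

# Hypothetical responses from five participating vendors
vendor_estimates = [420_000, 455_000, 480_000, 515_000, 610_000]
low, typical, high = rom_range(vendor_estimates)
print(f"ROM budget range: ${low:,} - ${high:,} (typical ${typical:,})")
```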

Please share your comments.