Address to
the Board of Control on April 27, 2012
Chair Richardson, members of the Board, and distinguished audience:
We would like to update you on the various presentations and accomplishments of the Senate this year and to discuss the presidential evaluation that was conducted recently. This slide depicts the presentations heard during the academic year, including one at the bottom that had not yet been held when I submitted these slides on April 11th; we did hear that update on April 18th. The listing covers a wide variety of topics and also includes the address by our research award winner and those given by our teaching award winners. These talks illustrate the overall purpose of the university, and many of the other talks, given by university administrators, were aimed primarily at sustaining these core activities, namely teaching and research.
During my last presentation here, I mentioned that we were awaiting a response from VP Reed regarding the research survey. The Senate appreciated the detailed response illustrated on this slide, where we see summaries, the complete response, and the decisions made on some of the items. We believe that this kind of deliberation on and promotion of the research survey is very important, since it shows that the survey was examined and acted upon.
Our stated goal of listing reports by University Committees has met with some success, and reports have been received from five of the ten committees. One committee decided to make a presentation to the Senate while we were in the process of conducting a search for volunteers. This presentation was reviewed quite favorably in our local newspaper, The Daily Mining Gazette, and that introduction will serve as the report from this committee. The Senate's Elections Committee Chair Jon Riehl recently conducted a search for volunteers for the committees shown here, and we were able to fill all the vacancies. Interestingly, there was even competition for some of these positions, which is a very welcome development.
This year we also investigated many proposals, and those shown on the next two slides were discussed previously. Thanks for approving Proposal 13-12 on Constitution Revision. The minor in Global Business was approved by the administration, and I note that you have also viewed that proposal favorably. Proposal 15-12 on General Education, over which the Senate's Curricular Policy Committee deliberated tirelessly under their Chair Andrew Storer, no doubt consumed much of the time spent between the Academic subsection of the Board and Provost Seel. This is indeed a very important proposal, and it met with very stiff debate in the Senate and on the various email discussion lists. Initially proposed as a complex entity consisting of two phases, the proposal was subsequently divided into two parts. The first part delineated a different path for the Gen Ed program for the first two years. Another proposal, 22-12, was then submitted, dealing with the formation of committees to investigate the "Communication Intensive" and "Global Learning" parts of the curricula, which, I think, will constitute the junior- and senior-year Gen Ed portions of a degree at Tech.
Proposal 21-12, brought on behalf of the Senate's Professional Staff Committee, corrects the Senate By-Laws to keep them consistent with Proposal 13-12, and it was indeed contingent upon 13-12 meeting with your approval. In the middle, proposals 16-12 through 20-12 are in part a series of proposals from the Senate's Instructional Policy Committee, IPC, and various parts of the administration, namely the Graduate School and the Registrar. Some of these address the problems I mentioned during my last presentation here as issues in need of action. I would like to acknowledge the work of the IPC Chair, Stephen Kampe, committee member Michael Johnson, who works in the Registrar's office, and Judi Smigowski in the Senate's office in writing these proposals in time for the Senate meeting. The last proposal listed, 23-12, deserves a slide of its own. We did have a presentation in the Senate by the Chair of the Senate's Finance Committee, Michael Mullins, in which he made the Senate aware of the pertinent issues related to Michigan Tech finances. This presentation was related to the one VP Reed had conducted a year ago, where people were given a spreadsheet and asked to make what was essentially a Sophie's choice between salary, tuition, and benefits. As a result of the more recent talk, and on the basis of the numerous suggestions we received about what could be improved, we believed that a committee should be formed with the eventual task of compiling, from the various subcommittees already in existence at Tech, suggestions as to what can be done to accomplish the goals stated in the title of this proposal. We note that this is not a new idea. Indeed, the University of Illinois recently commissioned a study of their university, and a special panel issued 43 recommendations for cutting costs there. The panel urged officials at other universities to consider these 43 ideas and see which ones would work for their institutions. Part of the impetus for this proposal stems from our activity in the Benefits Liaison Group, where employees are asked to implement all sorts of cost-saving methods to reduce our health care expenses. The hope here is that a similar awareness is attained in all of the university's endeavors. I must confess that this proposal was not received with any enthusiasm by either the Senate or the administration, and, interestingly, for completely different reasons. Paraphrasing the words uttered three years ago by this year's commencement speaker, Martha Sullivan, we looked squarely at the facts, accepted the reality that the proposal was not going to pass, and settled on having the Senate's Finance Committee work on preparing a short list of suggestions sometime next fall.
Next I would like to comment on the presidential evaluation. As shown here, we have managed to increase the participation percentage from 11.8% in 2007 to 45.2% this year. The survey itself was prepared as shown here, and the various people who contributed to it are indicated. This year we did have some difficulty when the previously employed distribution mechanism encountered delays, so we decided to conduct the survey on SurveyMonkey for the indicated cost. Just prior to distribution, we discovered that we needed to obtain permission from the union representatives shown here before sending out the evaluation to their members.
This slide depicts the participation percentages for various surveys at Boise State University. The highest return, 45%, was for a survey on computer needs, which is clearly something that affects everyone; the 45.2% we obtained this year can be judged on that basis. Additionally, we find on this slide that the return percentage was 16% for the evaluation of the President of the University of Michigan. Interestingly, there the lowest score was for consultation with faculty, which also scored low at Tech. This slide depicts the summary of the results obtained this year, which was distributed to all Senators to share with their constituents shortly after the evaluation closed. As you can see, the responses are grouped by Faculty, Professional Staff, Union workers, and the Administration. The data were calculated as shown here; this is essentially the mean score, or arithmetic average. The data over the years indicated for the President of the University of Michigan are interesting, and they reveal trends similar to the ones we see at Tech. In short, there appears to be difficulty with consulting with faculty and with high-level appointments. This next slide reveals, in Q11, an almost identical circumstance regarding health care and retirement benefits.
Noteworthy on this page is the fact that they conducted surveys on the graduate school, the library, and a variety of other issues. You may have noticed that they list the "Median" result. As you are no doubt aware, the median is, as shown on this slide, the 50th percentile, the score that divides the distribution into halves, while the mean, which is what we calculate, is the arithmetic average derived as shown on a previous slide. According to Beiling Xiao, the mean, the standard median, and the interpolated median coincide if the distribution is symmetrical. In those cases the mean is usually a more stable measure of central tendency than the median, and it can be handled arithmetically and algebraically. If the distribution is not symmetrical, the three measures do not coincide, and this slide shows the differences. The mean is much more affected by non-symmetrical or extreme values than the standard median: any alteration of the scores at the extreme ends of a distribution will have no effect at all on the median so long as the rank order of the scores is preserved.
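That contrast between the mean and the median can be sketched in a few lines of Python; the scores here are hypothetical illustrations, not actual evaluation data:

```python
from statistics import mean, median

# Hypothetical 5-point Likert responses (not actual survey data).
scores = [3, 3, 4, 4, 4, 5, 5]

# The same responses with the lowest score pushed to the extreme
# low end of the scale; the rank order of the scores is preserved.
skewed = [1, 3, 4, 4, 4, 5, 5]

print(mean(scores), mean(skewed))      # the mean drops: 4.0 vs about 3.71
print(median(scores), median(skewed))  # the median is unchanged: 4 and 4
```

The extreme value pulls the mean down while leaving the median untouched, which is exactly the difference in sensitivity at issue here.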
A non-symmetrical distribution is usually observed with a five-point Likert scale, which is used in teaching evaluations and in this evaluation of our president. Under these conditions, the mean and the standard median may not accurately reflect the skewed distribution. In this case, the interpolated median (IM), which adjusts the standard median upward or downward according to the distribution of the scores, provides a more accurate measure of central tendency for skewed data. In essence, the interpolated median estimates where the median would have occurred if the data were in a continuous, analogue form instead of digitized into discrete categories.
Interestingly, this slide, which contains information from the University of Michigan, explains why they used the interpolated median. The next slide shows how to calculate the IM.
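As a sketch of that calculation: a common grouped-data formula for the interpolated median treats each integer score k as a bin of width one spanning k - 0.5 to k + 0.5 and interpolates the median's position within its bin. I am assuming the slide uses this standard formula, and the function name below is mine:

```python
from collections import Counter

def interpolated_median(scores):
    """Interpolated median for discrete scores (e.g. a 5-point Likert scale).

    Each integer score k is treated as a bin spanning [k - 0.5, k + 0.5),
    and the median's position is interpolated within the bin that contains
    it, so the result reflects how the scores are distributed in that bin.
    """
    n = len(scores)
    counts = Counter(scores)
    cumulative = 0                       # responses below the current bin
    for value in sorted(counts):
        f = counts[value]                # frequency in this bin
        if cumulative + f >= n / 2:      # the median falls in this bin
            lower = value - 0.5          # lower boundary of the bin
            return lower + (n / 2 - cumulative) / f  # bin width = 1
        cumulative += f

# Symmetric responses: the IM equals the ordinary median of 3.
print(interpolated_median([1, 2, 3, 4, 5]))   # 3.0
# Responses piled up at the top of the scale: the IM is pulled
# upward from the ordinary median of 4.
print(interpolated_median([4, 4, 4, 5, 5]))   # about 4.33
```

For a symmetric distribution the result coincides with the ordinary median, while for responses piled up at one end of the scale the value is adjusted toward that end, which is the behavior described above.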
Armed with the ability to calculate the interpolated median, one can see the difference it makes in this year's evaluation results: some numbers increased dramatically. The point here is that these are the scores that could be used for inter-institutional comparisons. This also raises an interesting question as to whether we should be calculating the interpolated median on our teaching evaluations.
On this slide we see a "rough" comparison with relevant data over the years. I emphasize "rough" because during the years from 2005-2007 the nature and order of the response brackets varied and included categories such as "Not enough information to evaluate" and "average". During the 2007-2009 period, we featured many questions on the health care move from Blue Cross to Aetna, and then there was the matter of part of the administration moving off site. Additionally, the questions have changed over the years, and some of the early data had to be inferred from hard-copy bar charts. Also, in last year's evaluation, titled 2010-2011, people had to select on the evaluation itself whether they were faculty, staff, union, or administration. A failure to select accurately could explain why there are not large differences between the scores in the columns listed for faculty and staff for 2010-2011. This year, the returned evaluations were collated into separate bins based on employee status. In essence, a disgruntled trend similar to that obtained for President Coleman is observed, with transparency in budgeting, health care costs, and hiring from within notable among the issues. Overall, while there are fluctuations in the data, the scores hold at similar levels in a sinusoidal pattern, and some areas, for example leadership and diversity, have improved when the 2006 data are compared with the most recent results. The University went through difficult times at the start of this evaluation process, attained stability, and is now engaged in the transformational phases outlined in the Strategic Plan. Such difficult and lofty goals exact a price on the evaluation process, since change is not usually received kindly, especially by people secure in tenure.
Finally, the fact that the return percentage has increased is worth contemplating. One might be tempted to conclude that an increase signifies dissatisfaction with the leader. I do not believe this to be the case, since the scores look similar to those obtained over the years. Rather, I think it signifies confidence and trust in the system. I would like to think that this confidence and trust was earned, in some small way, through the various agreements that the Provost and the Senate were able to reach during the last three years. In his address to the Senate on April 4th, President Mroz also attributed this high return percentage to better communication resulting in part from the actions of the Provost. This feeling of confidence and trust takes a long time to build, since it is based on actions and perceptions, and unfortunately there is an evanescent quality to these.
Thanks
for listening.