Monday, January 14, 2019

Yes, Enrollment is Going Down. Also up.

When designing a data visualization, the first thing to ask is, "What does the viewer want to see, or need to know?"  If you're designing a dashboard for a CFO or a CEO or a VP for Marketing, those things are pretty straightforward: You're designing for one person, and you have a pretty good idea what that person wants.

But in higher education, we want to look at segments of the industry, and at trends that are specific to our sector.  And there are thousands of you (if this blog post draws average traffic, that is).  So I can't know what each of you wants.

This visualization of enrollment data measures only one thing: Enrollment.  But it measures several different types of enrollment (full-time, part-time, graduate, and undergraduate, in combination) at many different types of institutions (doctoral, baccalaureate, public, private, etc.)  And the best thing is that you can make it yours with a few clicks.

The top chart shows total headcount, and the bottom shows percentage change since the first year selected.  If you want to change the years, or change the types of enrollment, or the universe of the colleges selected, use the gray boxes at the right.  At any time, use the lavender box at top right to change the breakouts of the charts: To color by region, or grad/undergrad, or any other variable listed.

There are lots of interesting trends here, some of which will help you realize that while enrollment may be declining, it's not declining everywhere, or for every type of institution.

See something interesting? Post in the comments below.


Monday, December 10, 2018

Medical School Admissions Data

This is pretty interesting, I think, mostly for the patterns you don't see.

This is data on medical school admission in the US; some of it is compiled for a single year, and some for two years (which is OK, because this data appears to be pretty stable over time).

Tab 1 is not interactive, but does show applications, admits, and admit rates on grids defined by GPA and MCAT scores.  Darker colors show higher numbers (that is, more counts, or higher admit rates).  While we cannot get a sense of all takers like we do with other standardized tests, this does perhaps show a strong correlation between college GPA and MCAT scores.  (Of course, another explanation may be that students self-select out, which then makes me wonder about that one student with less than a 2.0 GPA and less than a 486 Total MCAT score who applied, was admitted, and then enrolled.)

The second and third tabs show applicants by undergraduate major, and ethnicity, respectively.  Choose a value at upper right (Total MCAT, or Science GPA, or Total GPA, for instance), and then compare that value for all applicants and all enrolling students on the bars; gold is applicants, and purple is enrollers.  The label only shows the value for the longer bar; hover on the other for details.

I was frankly surprised by some of these results.  How about you?


Thursday, December 6, 2018

2017 Admissions Data: First Look

IPEDS just released Fall, 2017 Admissions data, and I downloaded it and took a quick look at it. If you've been here before, most of this should be self-explanatory.

Three tabs here: The first is to take a look at a single institution.  Use the control at top to select the college or university you're looking for.  (Hint: type a few letters of the name to make scrolling quicker.)

The second tab allows you to compare ten institutions at once (you can do more, but it gets messy).  I started with the ten most people want to see, but you can remove any of them by scrolling to its check box in the drop-down, unchecking it, and clicking Apply.  Add institutions by checking the box by their name.

The final tab shows the relationship between test scores and Pell, which I've done before but never get tired of.  Choose SAT or ACT calculated means for the x-axis, then limit by region and/or control if you so desire.

Notes:

1) Some of the admissions data for 2017 is tentative, so anomalies are probably in error.
2) Test-optional colleges are not allowed to report test data.
3) Financial aid data is for 2016, as the 2017 data is not yet available.  It tends not to change dramatically from one year to the next, however.


Friday, November 30, 2018

The Death of History?

The last several days have seen a couple of articles about the decline of history majors in America.  How big is the problem?  And is it isolated, or across the proverbial board?

This will let you see the macro trend, and drill down all the way to a single institution, if you'd like.

The four charts, clockwise from top left are: Raw numbers of bachelor's degrees awarded from 2011-2016 (AY); percentage of total (which only makes sense when you color the bars) to show the origins of those degrees; percentage change since the first year selected; and numeric change since the first year selected.
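For anyone curious how the two change charts are computed, here's a minimal sketch in pandas.  The numbers are hypothetical stand-ins, not actual degree counts; the real values come from the IPEDS completions data behind the visualization.

```python
import pandas as pd

# Hypothetical bachelor's degree counts for one group, 2011-2016 (AY).
degrees = pd.Series([34000, 33500, 32800, 31900, 30700, 29500],
                    index=range(2011, 2017))

# Percentage change since the first year selected.
pct_change = 100 * (degrees / degrees.iloc[0] - 1)

# Numeric change since the first year selected.
num_change = degrees - degrees.iloc[0]
```

The first year always sits at zero on both change charts, which is why narrowing the year range shifts the whole picture.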

You can color the bars by anything in the top box at right (the blue one) or just leave totals; and you can filter the results to any region, or group of years, or major group (for instance, history, or physical sciences), or even any specific institution.  And of course you can combine filters to look at Business majors in the Southeast, if you wish.

That's it.  Pretty simple.  Let me know what looks interesting here.


Wednesday, November 28, 2018

Your daily dose of "No Kidding"

As a young admissions officer in 1985, I went to my first professional conference, AACRAO, in Cincinnati.  I don't remember much about it, but one session is still clear to me.  I had chosen it almost by accident, probably because it was admissions-focused at a conference that was mostly for registrars.  And fate stepped in.

There was a last minute substitution, and Fred Hargadon filled in for some person whose name is lost to history. At the time, I didn't think I'd stay in admissions long; my personality type is atypical for the profession, and I didn't find a lot to excite me.  But in this session I found someone who could approach the profession, well, professionally; someone who could view admissions in a much larger context than I was used to seeing.  Someone who was more intellectual and conceptual than friendly (although he was both).

I remember a lot of that session, but one thing has stuck with me through all this time.  He said, "In all my years in this profession, I've learned only two things: First, that the block on which you were born determines where you'll end up in life more than any other factor; and second, if we had to choose the absolute worst time to put someone through the college admissions process, it would be age 17."

It was that first part that hit me.  It still does.  And here is some data that suggests things beyond your control still determine where you end up.  It's from the NCES Digest of Education Statistics, and shows what had happened, ten years later, to students who were high school sophomores in 2002.

This is a pretty easy visualization to work with: The bottom bar chart shows the outcomes of the total group.  Then, using the filter at the top right, you can break out the top display by one of several values: Ethnicity (the default), gender, high school GPA, high school type, parental education, parental socioeconomic status, and the student's self-reported aspiration.  You can then see what percentage of each group has attained degrees, some education, or nothing beyond high school.  And of course, you can compare that breakout group to the total.

Use the "Highlight Outcome" function to make any particular level of education stand out.

Of course, the relationships between and among these variables are pretty clear, but the data are still telling: If you're white or Asian, if you're female, if you were a good student in high school, if you went to a private high school, if your parents went to college, if your parents were wealthier, and if you aspired to a degree, guess what? You were more likely to get a degree.

And of course, while some of these things are a function of birth, others, like your high school GPA and your aspirations, may be heavily influenced by educated, wealthy parents.

Play around a little bit, and if you are able to find one thing on this that surprises you, let me know.


Thursday, November 1, 2018

2018 AP Scores by State and Ethnicity

The College Board data on AP scores is now available for 2018, but it's hard to make sense of at a macro level.  The data are in 51 different workbooks, and, depending on how you want to slice and dice the data, as many as eight worksheets per workbook.  What's more, the data are structured to print on paper, for those who want to dive into one little piece of the big picture at a time.

So before going any further, I'd like us all to challenge the College Board and ACT to put out their data in formats that make exploring it easier for everyone.  Unless, of course, they really don't want to do that.

I downloaded all 51 workbooks and extracted the actual data using EasyMorph, then pulled it into Tableau for visualization and publication. There are four views here.
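The restructuring step is essentially un-pivoting the print-oriented layout into one row per exam and score.  I used EasyMorph, but here's a hedged miniature of the same idea in pandas, with made-up numbers standing in for one state's worksheet:

```python
import pandas as pd

# A hypothetical miniature of one state's print layout:
# exams in rows, one column per score level.
wide = pd.DataFrame({
    "Exam": ["Biology", "Calculus AB"],
    "Score 5": [120, 210],
    "Score 4": [300, 280],
})

# Un-pivot to one row per (exam, score) pair, which charts far more easily.
tidy = wide.melt(id_vars="Exam", var_name="Score", value_name="Count")
```

Once every state's sheet is in this tidy shape, stacking them into one table (with a state column added) is trivial, and Tableau can take it from there.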

The first tab is a simple scattergram, which may be enough: The relationship between a state's median income and the average AP exam score.  While blunt, it points out once again that we as a nation reward achievement in admissions (rather than merit) and that achievement is easier when you have more resources.  Filter by ethnicity or specific exam, and use the highlighters to show a state or region.

Tab two is a map, with average scores color coded.  Again, you see higher scores (orange and brown) in places where parental attainment and income are higher.  Again, two filters for you to drill down.

Tab three shows differences for any group by grade level and gender. It might be surprising to find that 11th graders generally score higher than 12th graders, until you realize that accomplished, driven children of successful parents load up on AP courses early to help with college applications.  But, given that girls have higher grades in high school than boys, you might also be surprised by the higher scores boys usually post in AP.  By the way, the young women go on to earn higher grades in college too, so wonder about that for a while.

The fourth tab shows score distributions two ways: On the left, with scores of 4 and 5 to the right, assuming 4 is generally the cutoff for college credit; since some of the groups are small (like Italian, for instance), I also put a stacked 100% bar on the right.  The Exam Groups filter at upper right clusters the tests by type (Science, Languages, etc.).

We all know that it is a good thing for students to work hard and challenge themselves in high school, but we also know--ceteris paribus--schools with more resources help prepare students for these exams better. As you look through these visualizations, I recommend you look at groups most underserved in our country, and ask whether the promise of AP has been delivered yet.

This data set is complicated and would need some explanation to manipulate, but I'll make the restructured version available to anyone in higher ed who wants it, via email to jon.boeckenstedt@depaul.edu.


Monday, October 1, 2018

Story Telling With Data Challenge

I've often seen the challenges issued by Cole Knaflic on the Story Telling With Data website, and found the most recent one, creation of a scatterplot, to be too tempting to pass up. I used Tableau to create it, and yes, I've written about this before.

This is IPEDS data, from Fall of 2015 (the most recent complete set available).  It shows the strong correlation between standardized test scores and income.  And I think it shows something else, too.

On the x-axis, choose SAT or ACT scores (depending on your comfort) to see how higher scores translate into fewer lower-income students (as measured by eligibility for Pell Grants).  The bubbles are color-coded by control, and sized by selectivity (that is, the percentage of freshman applications accepted.)  Highly selective institutions are coded as larger bubbles, and less selective as smaller bubbles.

Note the cluster of private, highly selective institutions at the lower right: Most of these are among the nation's wealthiest, yet they enroll the lowest percentages of low-income students.  And, at the same time, they deny admission to the greatest numbers of students.  I presume they had many low-income students among those who were not offered admission.

Causality is complex, of course, and tests measure and vary with social capital, opportunity, and student investment as well as income and ethnicity. But this is one of those instances where a single picture tells the whole story, I think.  What about you?


Thursday, August 30, 2018

An Interactive Retention Visualization

As I've written before, I think graduation rates are mostly an input, rather than an output.  The quality of the freshman class (as measured by a single blunt variable, average test scores) predicts with pretty high certainty where your graduation rate will end up. 

(Note: Remember, the reason test optional admissions practices work is that test scores and GPA are strongly correlated.  If you didn't have a high school transcript, you could use test scores by themselves, but they would not be as good; sort of like using a screwdriver as a chisel.  And the reason why mean test scores work in this instance is essentially the same reason your stock portfolio should have 25 stocks in it to reduce non-systematic risk.)

Further, choosing students with high standardized test scores means you're likely to have taken very few risks in the admissions process, as high scores signal wealth, more accumulated educational opportunity, and college-educated parents. That essentially guarantees high grad rates.

But you can see the data for yourself, below. How to interact:

Each dot is a college, colored by control: Blue for private, orange for public. Use the filter at right to choose either one, or both.

The six-year graduation rate is on the y-axis, and mean test scores of the Fall, 2016 freshman class are along the x-axis.  Using the control at top right, you can choose SAT or ACT.  Test-optional colleges are not allowed to report scores to IPEDS.

If you want to find a college among the 1,100 or so shown, type part of the name in the "Highlight" box.  Then select from the options given.  You should be able to find it.

Sound good? There is more.

Try using the "Selectivity" filter to look at groups of colleges by selectivity.  Notice the shape of the regression lines, and how they're largely the same for each group.

Finally, if you click on an individual college, you'll find that two new charts pop up at bottom.  One shows the ethnic breakdown of the undergraduate student body; one shows all the graduation rates IPEDS collects. If you click often enough, you'll see patterns here, too. Race signals a lot, including wealth and parental attainment, as those--again--turn into graduation rates.

A final note: I've added a variable called "Chance of Four-year Graduation" which is explained here.  The premise is that everyone thinks they're going to graduate from the college they enter, so of those who do graduate, what percentage do it in four?
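My reading of that premise, sketched as a tiny function (the linked post has the full explanation; the numbers in the usage note are hypothetical):

```python
def chance_of_four_year_grad(four_year_grads, six_year_grads):
    """Of the students who graduate within six years, the share who
    finished in four.  Returns None if no one graduated."""
    if six_year_grads == 0:
        return None
    return four_year_grads / six_year_grads
```

So a college where 600 of 800 eventual graduates finish in four years would show a 75% chance of four-year graduation.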

Tell me what you find interesting here.


Tuesday, July 17, 2018

All the 2015 Freshman Full-pays

There is no problem so great that it can't be solved by enrolling more full-pay students, it seems.  In the minds of some, no solution is more frequently tossed out.  I've heard several presidents say, "We're doing this to attract more full-pay students."

Before we dive too deeply into this, a definition: A "full-pay" student is not one who receives no aid; rather, it's one who receives no institutional aid.  Often these overlap considerably, but a student who receives a full Pell and/or state grant, and then takes out a PLUS loan, is a full-pay; all the revenue to the college comes in cash, from another source, rather than from its own financial aid funds.  The source of that cash matters not to the people who collect the tuition.  Got it?

This is a fairly deep dive into the IPEDS 2015 Fall Freshman data (there is 2016 admissions data, but financial aid data is only available for 2015-2016, so I used that admissions data to line things up.)  It's safe to say that things may have gotten slightly worse for most colleges since then, but there may be places where it's gotten better.  Discount at public institutions is less meaningful, so I've only included about 900 four-year, private, not-for-profit institutions from Doctoral, Masters, and Baccalaureate institutions with good data.

Eight views here: The first four are overviews, the next three are details within the larger context, and the final view is single institutions.  Colleges are banded into groups by selectivity in Fall, 2015, with more selective on the left, moving to the right.  Those groups are labeled "Under 15%," meaning the admit rate was under 15% in 2015; "15% to 30%," etc.  Open Admission at the right simply means the college generally admits all applicants, and is not required to report admissions data to IPEDS.

Ready? Use the tabs across the top to navigate.

1) Institutions and Full Pays: Looking at colleges by selectivity, what percentage of institutions fall into each group, and what percentage of full-pay students attend?  The orange line shows that 2.45% of colleges are in the most selective group, but 14.43% of full-pays (purple line) enroll there.  Sums accumulate to the right.

2) Enrollments and Full Pay: Similar data, except now the red line shows what percentage of freshman overall are enrolled in these institutions.  For instance, 5.27% of all freshmen, but 14.43% of all full-pay students, enroll in the under 15% group.  This also shows running percentages, so by the time you get to all colleges up to and including 45% to 60%, the numbers are 73% and 81%.

3) Freshman and Full-Pay Percentages: These are discrete.  The teal-colored bar, for instance, shows only students in that category (135,381 freshmen) and the percentage of students in that group who are full-pay (4.9%).

4) Full-pay Destinations: Where do full-pay students enroll?  This shows by region and selectivity, and you can filter to a single state if you'd like.  It just shows Fall, 2015 raw numbers.

5) 6) and 7) are similar charts, with the only difference being the value displayed.  In these three, dots represent a single institution, colored by region.  They're grouped by selectivity (left to right position), and then the vertical position shows the value.  Full-pays shows the percentage of full-pays in the 2015 freshman class. Discount shows discount rate (the sum of institutional financial aid divided by the sum of tuition and fees).  Average net revenue shows just that, which is the actual cash a college generates per student.  Use the highlight function to show a single college or highlight a region for comparison.
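The two calculated values in those views follow directly from the definitions above.  A minimal sketch, with hypothetical dollar figures in the usage note:

```python
def discount_rate(institutional_aid, tuition_and_fees):
    # Sum of institutional financial aid divided by the sum of
    # gross tuition and fees.
    return institutional_aid / tuition_and_fees

def average_net_revenue(tuition_and_fees, institutional_aid, freshmen):
    # The actual cash a college generates per student, after
    # subtracting its own aid.
    return (tuition_and_fees - institutional_aid) / freshmen
```

For instance, a college charging $100M in gross tuition and fees, awarding $40M of its own aid across 1,000 freshmen, would show a 40% discount rate and $60,000 in average net revenue per student.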

And finally, 8) Single Institution allows you to see those three variables for one institution at once.  They are colored by region.  You can sort by any column just by hovering over the axis and clicking the pop-up icon.  Sort descending by value, ascending by value, or alpha by name as you cycle through the clicks.

If your data are wrong, talk to your IR office.  If all data are wrong, drop me an email as I may have made a calculation error.  Otherwise, drop me a note and let me know what you think.


Wednesday, May 30, 2018

Measuring Internationalism in American Colleges

How International is a college?  And how do you measure it?  There are certainly a lot of ways to think about it: Location in an international city like New York, Chicago, or Los Angeles, for instance.  The extent to which the curriculum takes into account different perspectives and cultures, for another.

And, of course, there is some data, this time from the IIE Open Doors Project.  I did a simple calculation: I took the number of international students enrolled, plus the number of enrolled students studying abroad, and divided that sum by total enrollment to come up with an international index of sorts.

No, it's not precise, and yes, I know the two groups are not discrete, but this--like all the data on this blog--is designed to throw a little light on a question, not to answer it definitively.
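The calculation, as a sketch (the figures in the usage note are hypothetical, not from any real institution):

```python
def international_index(intl_enrolled, study_abroad, total_enrollment):
    # Roughly: the chance a randomly chosen student is either
    # international or studied abroad in the last year.  The two
    # groups can overlap, so treat this as an index, not a true
    # probability.
    return (intl_enrolled + study_abroad) / total_enrollment
```

A campus of 8,000 with 500 international students and 300 studying abroad would score 0.1, or about a one-in-ten chance.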

You'll find data on all the colleges that participate in the IIE survey, displayed in four columns: Total enrollment (on the left), International enrollment, Overseas study numbers, and the International Engagement Index, which is roughly the chance that a randomly selected student is either international or has studied abroad in the last year.

The colleges are sorted by the first column, total enrollment.  You may want to see who has the most international students, or the highest International Index.  It's easy to sort these columns by hovering over the small icon near the axis label, as pictured below and indicated by the yellow arrow.  There is one for each column; give it a try, and if you get stuck, use the reset button.


As always, feel free to leave a comment below.



Thursday, May 10, 2018

Looking at Transfers

It's official: Princeton has broken its streak of not considering transfer students for admission, and has admitted 13 applicants for the Fall, 2018 term of the 1,429 who applied, for an astonishing how-low-can-you-go admit rate of 0.9%.  Of course, we'll have to wait until sometime in the future to see how many--if any--of them actually enroll.

I thought it might be interesting to take a look at transfers, so I did just that, using an IPEDS file I had on my desktop.  There are four views here, and they're pretty straightforward:

The first tab shows the number of transfers enrolled by institution in Fall, 2016 (left-hand column) and the transfer ratio.  The ratio simply indicates how many new transfer students you'd meet if you went to that college campus in Fall, 2016 and chose 100 students at random.  A higher number suggests a relatively more transfer-friendly institution.  You can choose any combination of region, control, and broad Carnegie type using the filters at the top.
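In code, the ratio is just a per-100 rate.  A minimal sketch, with made-up counts in the usage note:

```python
def transfer_ratio(new_transfers, total_undergrads):
    # New Fall transfers you'd expect to meet among 100 randomly
    # chosen students on campus that term.
    return 100 * new_transfers / total_undergrads
```

So 250 new transfers on a campus of 5,000 undergraduates gives a ratio of 5: meet 100 students at random and about five would be new transfers.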

The second tab shows the same data arrayed on a scattergram; type any part of a college name and then select it to see it highlighted on the chart.  Hover over a point for details.

The third chart is static, and shows undergraduate enrollment in Fall, 2016 and the number of new transfer students in the same term.  The bars are split by region and colored by Carnegie type.

And the last tab shows the weighted transfer ratios, split the same way.

As you'll see, thirteen students doesn't seem so significant against the 810,000 new transfers in Fall, 2016.  But it's a start.


Monday, May 7, 2018

Want to increase graduation rates? Enroll more students from wealthier families.

OK. Maybe the headline is misleading.  A bit.

I've written about this before: The interconnectedness of indicators of college success.  This is more of the same, with fresher data, to see if anything has changed.  Spoiler alert: Not much.

What's new this time is the IPEDS publication of graduation rates for students who receive Pell and those who don't, along with overall graduation rates.  While the data are useful in aggregate to point out the trends, at the institutional level, they are not.

First, some points about the data:  I've included here colleges with at least 20 Pell-eligible freshmen in 2015, just to eliminate a lot of noise.  Colleges with small enrollments don't always have the IR staff to deliver the best data to IPEDS, and they make the reports a bit odd.  And even without these institutions, you see some issues.

Second, colleges that do not require tests for admission are not allowed to report tests in IPEDS.  Once you check "not required," the box for test scores gets grayed out, so attempting to report them is futile.

But, it's here.  View one shows pretty much every four-year public and private not-for-profit college in the US, and includes four points: On the left as dots are six-year grad rates for all students (light blue), Pell students (dark blue), and non-Pell students (purple).  On the right is the gap between Pell and non-Pell grad rates.  Again, some of these numbers are clearly wrong, or skewed by small numbers, in spite of the exclusion noted above.

The next four collectively tell the story of wealth and access:


  • If you have more Pell students, your graduation rate is lower
  • While most colleges do a pretty good job of keeping Pell and non-Pell grad rates close, there are some troubling outliers
  • If you focus on increasing SAT scores in your freshman class, you'll pretty much assure yourself of enrolling fewer low-income students
  • But if you have higher mean freshman test scores, you'll see higher grad rates
In other words, test scores are income; income is fewer barriers to graduation.  And colleges are thus incentivized not to enroll more low-income students: It hurts important pseudo-measures of quality in the minds of the market: Mean test scores, and graduation rates.

If you're interested in a much deeper dive on this with slightly older data, click here.  Otherwise, feel free to play with the visualization below.


Thursday, March 29, 2018

How have admit rates changed over time?

Parents, this one's for you.

Things are different today, or so everyone says.  If you want to see how admit rates have changed over time at any four colleges, this is your chance.  Just follow the instructions and take a look at how things have changed over the years.  The view starts with four similar midwestern liberal arts colleges, but you can compare any four of your choice.  (And before you ask, 2016 is the most recent data available in IPEDS.)

And, a note: These changes are not all driven solely by demand.  Colleges can manipulate overall admit rates by taking a larger percentage of their class via early programs, and admit rates in those programs can be as much as 30 points higher than in regular decision.


Tuesday, March 13, 2018

Early Decision and Early Action Advantage

There is a lot of talk about admission rates, especially at the most competitive colleges and universities, and even more talk, it seems, about how much of an advantage students get by applying early, via Early Decision (ED, which is binding) or Early Action (EA, which is non-binding, though sometimes restrictive).

I license the Peterson's data set, and they break out admissions data by total, ED, and EA, and I did some calculations to create the visuals below.

Two important caveats: Some colleges clearly have people inputting the data who do not understand our terminology, who don't run data correctly, or who make a lot of typos (a -500% admission rate is probably desirable, but not possible, for instance).  Second, not every university with an EA or ED option (or any combination of them, including the different ED flavors), breaks out their data.

Start with the overall admit rate.  That's the one that gets published, and the one people think about. It's the fatter, light gray bar.  Then, the purple bar is the regular admit rate, that is, the calculated estimate of the admit rate for non-early applications (this is all applications minus all early types).  The light teal bar is the early admit rate: ED plans on the top chart, and EA plans on the bottom.  Some colleges have both, of course, but most show up only once.
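The regular-rate estimate described above is simple arithmetic: remove the early counts from both sides of the fraction.  A sketch, with hypothetical counts in the usage note:

```python
def regular_admit_rate(total_apps, total_admits, early_apps, early_admits):
    # Estimated non-early admit rate: strip early applications and
    # early admits from the totals before dividing.
    return (total_admits - early_admits) / (total_apps - early_apps)
```

For example, a college with 10,000 applications and 3,000 admits overall, of which 4,000 applications and 2,000 admits came early, would have a 30% overall rate, a 50% early rate, and only about a 17% regular rate.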

You can use the filter at right to include colleges by their self-described level of admissions difficulty.

Working on another view to show the number of admits scooped up early vs. regular.  Stay tuned.  Until then, what do you notice here?  Leave a comment below.


Thursday, March 1, 2018

Tuition at State Flagships

The College Board publishes good and interesting data about college tuition, including a great table of tuition at state flagship universities. (I realized while writing this that I don't know how a university is designated a state flagship.  Maybe someone knows.)

There is some interesting stuff here, but I'll leave it for you to decide what jumps out at you: If you live in North Dakota, you might wonder why South Dakota has such low tuition for non-residents.  If you live just outside Virginia or Michigan, you might wonder why it costs so much to cross the border.

Anyway, using the tabs across the top, there are five views here:

Maps

Four maps, showing (clockwise from upper left) in-state tuition, out-of-state tuition, non-resident premium index (that is, how much extra a non-resident pays, normalized to that state's in-state tuition), and the non-resident premium in dollars.  Hover over a state for details.  You can change the year, and see the values in 2017 inflation-adjusted dollars, or nominal (non-adjusted) dollars.

States in Context

This arrays the states by tuition over time.  Use the highlight functions (go ahead, type in the box; you won't break anything) to focus on a region or a specific state. You can view resident or non-resident tuition, adjusted or non-adjusted.

Single Institution

Just what it says.  The view starts with The University of Michigan, but you can change it to any state flagship using the control at top right. Percentage increase is best viewed in 2017 adjusted dollars, of course.

Percentage Change

Shows change of in-state tuition by institution over time.  The ending value is calculated as a percentage change between the first and last years selected, so use the controls to limit the years.  Again, highlight functions put your institution in context.

Non-resident Premium 

This shows how much extra non-residents pay, and trends over time.  Again, highlighter is your best friend.

Feel free to share this, of course, especially with people who are running for office in your state.

And, as always, let me know what you think.


Monday, February 26, 2018

College Board AP Data

The College Board recently released data on its AP Exams.  I've downloaded several workbooks already, and of the one I've dug into, I've only been able to get through two worksheets.  The data presentation is clunky (please, agencies, provide un-pivoted data without merged cells and totals and all that stuff, if not by itself, then as a companion), but it reveals some interesting patterns.

Well, I think so.

I've visualized it in five views: The source of the data is here, in case you want to download it yourself.

View 1, Totals (using the tabs across the top) is just totals: Use the controls to show males or females, or certain scores, or certain exams.  I think it's very compelling, especially if you look at the high scores in light of the College Board's claims about AP opening access to selective institutions.

View 2, Scores by Ethnicity and Exam, shows score distributions of the four largest ethnic groups.  Filter by a single exam if you'd like.

View 3, 100% Stacked Bars, shows the same data, presented by ethnicity.  Again, filter to a test if you'd like.

View 4, Mean Scores by Ethnicity and Exam, arrays all tests, and breaks out mean scores (yes, I know you shouldn't take averages of string variables.  So sue me).  Use the highlighter if you'd like to make any of the groups stand out visually, and filter by gender if you'd like.

View 5, Mean Scores by Gender and Exam, shows the differences between males and females. Filter to a single ethnicity if you'd like.

Tell me what you see.  Does this change your perspective on the College Board claims, or does it strengthen them?  Does it help you make up your mind?

I'd love to hear.


Wednesday, January 31, 2018

How is College Enrollment in the US Changing?

College enrollment is down.  Or maybe it's up.  Or maybe it's both.

When you read headlines, you don't get a lot of nuance. And in a country as big as ours, with such an incredible diversity of programs and widely divergent institutions, nuance is important.  So this may help do the trick.

This is enrollment data from about 6,600 post-secondary institutions in the US, going back as far as 1980.  It includes every institution: those that grant degrees and those that don't; four-year private not-for-profits, for-profits, and publics; liberal arts colleges, research universities, and technical institutes.  All here.

It's on two dashboards.  The first shows all undergraduate and graduate enrollment at all these institutions, since 1980.  (Note: The data skips from 1980 to 1984, and I took out two years of data--1998 and 1999--because they looked a little funky.)

On the first dashboard, there are several controls to filter the data.  So for instance, if you want to look at just doctoral institutions, you can do that.  Just colleges in New England? Yes.  Only care about full-time enrollment? Just use the filter to select it.  If graduate enrollment is your interest, it's easy to get rid of the undergraduate data.  Just use the controls.  The top chart shows raw numbers, and the bottom chart shows percent change over time.  If you want a longer or shorter window, there's a control to limit the number of years.  This is especially helpful to show percent change.

Then you can break out whatever enrollment you've selected.  Use the control titled "Color Lines By" and you can split the data shown into groups.

Try it.  You won't break anything.  You can always reset using the little reset button at the bottom.

The second dashboard (using tabs across the top) shows similar data, but you can choose an individual college.  Once you've done so, you can limit the data shown, and you can also split it out according to your interest.

Have fun.  I've found some interesting little ditties I'll be tweeting out, and I encourage you to do the same.


Thursday, January 25, 2018

A Quick Look at the NACUBO Endowment Data

Each year NACUBO releases its study of endowment changes at about 800 colleges and universities in the US and Canada.  For this post, I'm including only those institutions in the US that reported two years of data to the survey: 787 institutions in all.

Higher education in the US, of course, is a classic story of the haves and have-nots; a few institutions at the top of the endowment food chain have amassed enormous endowments, allowing them great freedom in the programs they offer and the students they enroll.  In fact, the 21 best-endowed institutions control over half (about $280B) of the $560B held overall, leaving the other 766 to divvy up the remaining $280B among them; the top 93 own 75%.
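The concentration arithmetic itself is simple.  Here's a sketch with toy endowment values (made-up numbers, not the actual NACUBO figures):

```python
import numpy as np

# Toy endowment values in $B, one per institution -- made-up numbers,
# just to show how the "top N hold half" figure is computed.
endowments = np.array([37.1, 24.8, 22.0, 15.0, 10.0, 5.0, 3.0, 2.0, 1.5, 0.5]
                      + [0.2] * 90)
endowments = np.sort(endowments)[::-1]          # largest first

# Cumulative share of all assets held by the top 1, 2, 3, ... institutions.
share = np.cumsum(endowments) / endowments.sum()

# Smallest number of institutions that together hold at least half.
n_half = int(np.argmax(share >= 0.5)) + 1
print(n_half)  # 3 -- in this toy sector, three institutions hold half
```

Run the same calculation on the real NACUBO figures and you get the 21-of-787 concentration described above.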

What's more interesting, I think, is the astonishing endowment growth: Stanford added $2.4B to its endowment in one year.  That gain is bigger than the total 2017 endowment of all but 38 of these institutions.  In other words, if the gain on Stanford's endowment were itself an endowment, it would be the 39th largest in the nation.  And in total value, Stanford still trails Harvard by about $12B.

A couple of notes: Endowment growth is not the same as investment performance; some of the growth or loss can be accounted for by additions and withdrawals as well.  Second, endowments are not a big pot of money the college can spend as it wishes.  Some percentage of the income from endowments is restricted to certain programs, and endowed funds often carry additional expenses the college has to come up with on its own.

Still, I think this is interesting and compelling.  Let me know what you think.




Monday, January 15, 2018

National Trends in Applicants, Admits, and Enrolls, with Draw Rates

If you read this blog regularly, you'll know I'm interested in the concept of the Draw Rate, a figure seldom used in college admissions.  Many people, when thinking about market position in higher education, use selectivity or admit rate (the percentage of applicants admitted), or yield rate (the percentage of students offered admission who enroll) by themselves.

But in the market of higher education, these two variables often fight against each other. (BTW, if you object to the use of the word "market" in higher education because you think it debases our profession, see what Zemsky, Wegner, and Massy have to say about that here.)

Colleges, driven by market expectations, have for a long time tried to increase applications, believing that what the market wants is greater selectivity in the institution they choose, based on the Groucho Marx effect. Except that in order to enroll the class you want, you have to take more students when apps go up (at least in the case of the bottom 90% of colleges).  That's because your incremental applications almost certainly have a lower propensity to enroll.

So, Draw Rate (yield rate/admit rate) helps account for that.  Higher Draw Rates are generally a sign of higher market position.  Think about it mathematically: A very high numerator (high yield) coupled with a very low denominator (low admit rate) is the thing many colleges pursue.  If you pursue greater selectivity and don't account for the lower yield, you won't be in enrollment management too long.
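Since the arithmetic trips people up, here's a minimal sketch of the draw rate calculation, using hypothetical admissions numbers:

```python
# Draw rate = yield rate / admit rate, per the definition above.
def draw_rate(applied: int, admitted: int, enrolled: int) -> float:
    admit_rate = admitted / applied
    yield_rate = enrolled / admitted
    return yield_rate / admit_rate

# Hypothetical numbers. A highly selective college: 5% admit, 80% yield.
print(round(draw_rate(40_000, 2_000, 1_600), 2))   # 16.0

# A less selective one: 70% admit, 20% yield -- a draw rate below 1.
print(round(draw_rate(10_000, 7_000, 1_400), 2))   # 0.29
```

The high numerator over a low denominator is exactly the combination described above: strong yield despite strong selectivity signals strong market position.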

The problem, of course, is that, in general, people who were not born 18 years ago don't apply to college.  And the number of people who will turn 18 in any given year continues to drop going forward.  So no matter how many applications each student makes, they can only go to one college next fall.

Over the past several years, the "Winner Take All" mentality has driven demand at the most selective institutions.  The need to keep up trickles down to each tier below, and the annual "We received a record number of applications for this freshman class" shtick gets old fast, even if colleges have not gotten that message yet.

The takeaway: Colleges have been spinning their wheels, working harder and harder to generate more applications just to stay even.  The national psychosis weighs heavily on the minds of parents and students, and they respond by hedging their bets, applying to--guess what--more colleges.  And the spiral spirals out of control.

Here are five views (using tabs across the top) to show the data.

Dashboard 1 is a high-level overview of applications, admits, and enrolls at four-year public and four-year private, not-for-profit institutions (open-admission institutions do not report application activity to IPEDS).  You can use the control at top to show all institutions, or just public or private.  Top view is raw numbers; bottom is percent change.

Dashboard 3, the next tab, shows the same data on bar charts, with the draw rate as a brown line hovering over the bars.  Note how it's dropped over time: This is the effect of soft applications.  You can look at any region, or any single institution if you want, but the really interesting filter is at top right: Compare colleges by their 2016 selectivity.  You see that the only institutions who have collectively increased their draw rates are exactly the ones who had the strongest market position already: The most selective colleges.  Step down from Most to Highly to Very, etc, and watch the trend on the brown line.

Next comes Dashboard 2, showing Applications per Seat in the Freshman Class, and draw rate by region.  This might explain why we in the Midwest are fascinated with the obsession with college admissions by East and West Coast media.  Y'all are welcome to come to the Midwest and chill, if you'd like.  You can use the filter to select groups of colleges by Carnegie type.

Dashboard 4 shows four key metrics to reinforce the relationship between and among them.  Again, select by 2016 Selectivity to see how they make a difference.

Finally, Dashboard 5 allows you to compare individual institutions.  I've put Harvard, Stanford, and MIT on to start, but you can choose any colleges you wish.  (I recommend no more than three or four at a time.)  To remove a college, hover over its name in the filter and X it out.  To add, type any part of the name and hit "Enter" on your keyboard.  You'll be presented with all possible matches, and just choose the ones you want.  I recommend choosing similar institutions for scaling/charting purposes.

I hope this is interesting to you; let me know what you see, and if you spot any problems.




Wednesday, January 3, 2018

Freshman Migration, 2010-2016

This is perhaps the most popular post, as well as my personal favorite, and I'm sad that I can only do it once every two years (the IPEDS reporting cycle only requires this data be reported biennially).

This shows patterns of freshman migration within and outside of state boundaries.  It's valuable because you can see the composition of freshman classes at colleges: Where do the students come from?  You can also see patterns of state exports: Which states keep students at home, and which send them out-of-state?  (Of course, the size and educational offerings of the various states mean it's often unfair to compare, but it's still interesting.)

For this, I've limited the universe to four-year, public and private, not-for-profit institutions. Community colleges and for-profit colleges tend to have very local enrollment patterns, and high numbers of part-time students. I've also taken out institutions whose primary focus is religious training, as well as those from a few obscure Carnegie categories.

The freshmen in this analysis are only those who graduated from high school within twelve months of enrolling in college.  A word of caution: If you are afraid to click buttons and interact, stop now; this won't be of any help to you.  You can't break these, and you can always reset using the controls at lower right.  So click around and explore the data.

Finally, this shows the data I downloaded. Some of it is pretty clearly wrong, but that's not my problem. Contact the IR office at the offending institution and ask them what they were thinking.

So, first up: If you want to compare any four colleges on the geographic composition of their freshman classes, start here. I've added four colleges that start with "D" but you can use the controls to look at any four you want. Note: Students labeled as "in-region" are from the region, but not the state. Therefore someone "in-region" in a New Hampshire college would be from one of the five other New England states. Got it? Good. Play away on this one:



Next up: the bar charts.  They're a little more complex, but you can do it.  If you want to see which colleges enroll the most (top chart) or the highest percentage (bottom chart) of students from in-state, in-region, or out-of-region, this is your visualization.  Choose a year (it defaults to 2016), and if you wish, limit it to colleges in a region (the Southeast, for instance).  You can limit to public or private as well.  Then choose which group of students you want to explore: in-state, in-region, or out-of-region.  Again, comparing Texas to Rhode Island should only be done for the "interestingness factor," not to draw conclusions.



Here is the same data, represented on a scatter plot, in case you want to step back, and see the data all at once.  The two scales are the number of freshmen, and the percent from the region selected.



Which states export the most students, and when they export them, where do those students end up?  If you've wondered that--or if you're from Illinois or New Jersey and lament our students' mobility--this is the visualization for you.

Choose a year, and see (on the top bars, in purpley-mauve) which states exported the most students.  Then, click on a bar representing a state to see where students from that state enrolled, in the bottom chart.  If you want the college destinations to be limited to public or private, or a certain region, you can use those controls to do so.



And finally, if you're interested in which states keep students at home, you can see that, too, on this visualization. The top view looks at colleges in a state, and where their students come from; the bottom looks at students from that state, and whether they go out-of-state or stay in-state.  Again, choose a year or institutional type, if you want to look at colleges or students going to those types of colleges.



I hope you have enjoyed looking at this data as much as I have enjoyed playing with it. If you spot any errors that I've made (Tableau still has no spell check....) let me know, and I'll get to fixing them right away. Otherwise, leave a comment below with questions or observations.