Thursday, November 1, 2018

2018 AP Scores by State and Ethnicity

The College Board data on AP scores is now available for 2018, but it's hard to make sense of at the macro level.  The data are spread across 51 different workbooks and, depending on how you want to slice and dice them, as many as eight worksheets per workbook.  What's more, the structure is a problem: the files are designed to print on paper, for those who want to dive into one little piece of the big picture at a time.

So before going any further, I'd like us all to challenge the College Board and ACT to put out their data in formats that make exploring data easier for everyone. Unless, of course, they really don't want to do that.

I downloaded all 51 workbooks and extracted the actual data using EasyMorph, then pulled it into Tableau for visualization and publication. There are four views here.
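If you'd rather script that consolidation step than use EasyMorph, a few lines of Python will do roughly the same thing.  This is only a sketch: the folder name, file naming, sheet name, and header row below are hypothetical, since the College Board's actual layout varies from worksheet to worksheet.

```python
# Minimal sketch: stack 51 state workbooks into one tidy file for Tableau.
# The folder, sheet name, and skiprows value are assumptions, not the real layout.
import glob
import pandas as pd

frames = []
for path in glob.glob("ap_2018/*.xlsx"):                      # one workbook per state
    df = pd.read_excel(path, sheet_name="All Students", skiprows=3)
    df["state"] = path.split("/")[-1].replace(".xlsx", "")    # carry the state along
    frames.append(df)

pd.concat(frames, ignore_index=True).to_csv("ap_2018_combined.csv", index=False)
```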

The first tab is a simple scattergram, which may be enough: The relationship between a state's median income and the average AP exam score.  While blunt, it points out once again that we as a nation reward achievement in admissions (rather than merit) and that achievement is easier when you have more resources.  Filter by ethnicity or specific exam, and use the highlighters to show a state or region.

Tab two is a map, with average scores color coded.  Again, you see higher scores (orange and brown) in places where parental attainment and income are higher.  Again, two filters for you to drill down.

Tab three shows differences for any group by grade level and gender. It might be surprising to find that 11th graders generally score higher than 12th graders, until you realize that accomplished, driven children of successful parents load up on AP courses early to help with college applications.  But, given that girls have higher grades in high school than boys, you might also be surprised by the higher scores boys usually post in AP.  By the way, the young women go on to earn higher grades in college too, so wonder about that for a while.

The fourth tab shows score distributions two ways: On the left, with scores of 4 and 5 to the right, assuming 4 is generally the cutoff for college credit; since some of the groups are small (like Italian, for instance), I also put a stacked 100% bar on the right.  The Exam Groups filter at upper right clusters the tests by type (Science, Languages, etc.).

We all know that it is a good thing for students to work hard and challenge themselves in high school, but we also know--ceteris paribus--schools with more resources help prepare students for these exams better. As you look through these visualizations, I recommend you look at groups most underserved in our country, and ask whether the promise of AP has been delivered yet.

This data set is complicated and would need some explanation to manipulate, but I'll make the restructured version available to anyone in higher ed who wants it, via email to jon.boeckenstedt@depaul.edu.




Monday, October 1, 2018

Story Telling With Data Challenge

I've often seen the challenges issued by Cole Knaflic on the Story Telling With Data website, and found the most recent one, creation of a scatterplot, to be too tempting to pass up. I used Tableau to create it, and yes, I've written about this before.

This is IPEDS data, from Fall of 2015 (the most recent complete set available).  It shows the strong correlation between standardized test scores and income.  And I think it shows something else, too.

On the x-axis, choose SAT or ACT scores (depending on your comfort) to see how higher scores translate into fewer lower-income students (as measured by eligibility for Pell Grants).  The bubbles are color-coded by control, and sized by selectivity (that is, the percentage of freshman applications accepted.)  Highly selective institutions are coded as larger bubbles, and less selective as smaller bubbles.

Note the cluster of private, highly selective institutions at the lower right: Most of these institutions are among the nation's wealthiest, yet they enroll the lowest percentages of low-income students.  And, at the same time, they deny admission to the greatest numbers of students.  I presume they had many low-income students among those who were not offered admission.

Causality is complex, of course, and tests measure and vary with social capital, opportunity, and student investment as well as income and ethnicity. But this is one of those instances where a single picture tells the whole story, I think.  What about you?


Thursday, August 30, 2018

An Interactive Retention Visualization

As I've written before, I think graduation rates are mostly an input, rather than an output.  The quality of the freshman class (as measured by a single blunt variable, average test scores) predicts with pretty high certainty where your graduation rate will end up. 

(Note: Remember, the reason test optional admissions practices work is that test scores and GPA are strongly correlated.  If you didn't have a high school transcript, you could use test scores by themselves, but they would not be as good; sort of like using a screwdriver as a chisel.  And the reason why mean test scores work in this instance is essentially the same reason your stock portfolio should have 25 stocks in it to reduce non-systematic risk.)

Further, choosing students with high standardized test scores means you're likely to have taken very few risks in the admissions process, as high scores signal wealth, more accumulated educational opportunity, and college-educated parents. That essentially guarantees high grad rates.

But you can see the data for yourself, below. How to interact:

Each dot is a college, colored by control: Blue for private, orange for public. Use the filter at right to choose either one, or both.

The six-year graduation rate is on the y-axis, and mean test scores of the Fall, 2016 freshman class are along the x-axis.  Using the control at top right, you can choose SAT or ACT.  Test-optional colleges are not allowed to report scores to IPEDS.

If you want to find a college among the 1,100 or so shown, type part of the name in the "Highlight" box.  Then select from the options given.  You should be able to find it.

Sound good? There is more.

Try using the "Selectivity" filter to look at groups of colleges by selectivity.  Notice the shape of the regression lines, and how they're largely the same for each group.

Finally, if you click on an individual college, you'll find that two new charts pop up at bottom.  One shows the ethnic breakdown of the undergraduate student body; one shows all the graduation rates IPEDS collects. If you click often enough, you'll see patterns here, too. Race signals a lot, including wealth and parental attainment, as those--again--turn into graduation rates.

A final note: I've added a variable called "Chance of Four-year Graduation" which is explained here.  The premise is that everyone thinks they're going to graduate from the college they enter, so of those who do graduate, what percentage do it in four?
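If you want to reproduce that measure yourself from the IPEDS graduation rates, it's just the four-year rate divided by the six-year rate.  A quick sketch with made-up numbers:

```python
# Hypothetical college: what share of eventual graduates finish in four years?
four_year_rate = 0.52          # 52% of the entering cohort graduates within four years
six_year_rate = 0.68           # 68% graduates within six years

chance_of_four_year_grad = four_year_rate / six_year_rate
print(f"{chance_of_four_year_grad:.1%}")   # 76.5% of those who do graduate do it in four
```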

Tell me what you find interesting here.




Tuesday, July 17, 2018

All the 2015 Freshman Full-pays

There is no problem so great that it can't be solved by enrolling more full-pay students, it seems.  And in the minds of some, there is no solution so frequently tossed out there.  I've heard several presidents say, "We're doing this to attract more full-pay students."

Before we dive too deeply into this, a definition: A "Full-pay" student is not one who receives no aid; rather it's one who receives no institutional aid. Often these overlap considerably, but a student who receives a full Pell and/or state grant, and then takes out a PLUS loan is a full-pay; all the revenue to the college comes in cash, from another source, rather than its own financial aid funds.  The source of that cash matters not to the people who collect the tuition.  Got it?
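In data terms, the flag is nothing more than "no institutional aid."  A tiny sketch, with hypothetical column names and amounts, of how I'd code it:

```python
import pandas as pd

# Hypothetical student records; only the institutional aid column matters here.
students = pd.DataFrame({
    "pell_grant":        [6_000,      0,      0],
    "state_grant":       [4_000,      0,      0],
    "plus_loan":         [20_000,     0, 15_000],
    "institutional_aid": [0,     18_000,      0],
})

# Full-pay = no institutional aid, regardless of Pell, state grants, or PLUS loans;
# the college still collects its full price in cash from some other source.
students["full_pay"] = students["institutional_aid"] == 0
print(students["full_pay"].tolist())   # [True, False, True]
```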

This is a fairly deep dive into the IPEDS 2015 Fall Freshman data (there is 2016 admissions data, but financial aid data is only available for 2015-2016, so I used the 2015 admissions data to line things up).  It's safe to say that things may have gotten slightly worse for most colleges since then, but there may be places where it's gotten better.  Discount at public institutions is less meaningful, so I've only included about 900 four-year, private, not-for-profit Doctoral, Masters, and Baccalaureate institutions with good data.

Eight views here: The first four are overviews, the next three are details within the larger context, and the final view is single institutions.  Colleges are banded into groups by selectivity in Fall, 2015, with more selective on the left, moving to the right.  Those groups are labeled "Under 15%," meaning the admit rate was under 15% in 2015; "15% to 30%," etc.  Open Admission at the right simply means the college generally admits all applicants, and is not required to report admissions data to IPEDS.
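If you wanted to rebuild those bands from raw IPEDS admit rates, something like pandas' cut would do it.  The band edges follow the labels described above, but the intermediate labels and the admit rates below are my assumptions, for illustration only:

```python
import pandas as pd

# Hypothetical admit rates (admits / applicants) for a handful of colleges.
admit_rate = pd.Series([0.09, 0.22, 0.41, 0.55, 0.78])

bands = pd.cut(
    admit_rate,
    bins=[0, 0.15, 0.30, 0.45, 0.60, 1.0],
    labels=["Under 15%", "15% to 30%", "30% to 45%", "45% to 60%", "Over 60%"],
)
print(bands.tolist())   # ['Under 15%', '15% to 30%', '30% to 45%', '45% to 60%', 'Over 60%']
```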

Ready? Use the tabs across the top to navigate.

1) Institutions and Full Pays: Looking at colleges by selectivity, what percentage of institutions fall into each group, and what percentage of full-pay students attend them?  The orange line shows that 2.45% of colleges are in the most selective group, but 14.43% of full-pays (purple line) enroll there.  Sums accumulate to the right.

2) Enrollments and Full Pay: Similar data, except now the red line shows what percentage of freshman overall are enrolled in these institutions.  For instance, 5.27% of all freshmen, but 14.43% of all full-pay students, enroll in the under 15% group.  This also shows running percentages, so by the time you get to all colleges up to and including 45% to 60%, the numbers are 73% and 81%.

3) Freshman and Full-Pay Percentages: These are discrete.  The teal colored bar, for instance, shows only students in that category (135,381 freshmen) and the percentage of students in that group who are full-pay (4.9%).

4) Full-pay Destinations: Where do full-pay students enroll?  This shows by region and selectivity, and you can filter to a single state if you'd like.  It just shows Fall, 2015 raw numbers.

5) 6) and 7) are similar charts, with the only difference being the value displayed.  In these three, dots represent a single institution, colored by region.  They're grouped by selectivity (left to right position), and then the vertical position shows the value.  Full-pays shows the percentage of full-pays in the 2015 freshman class. Discount shows discount rate (the sum of institutional financial aid divided by the sum of tuition and fees).  Average net revenue shows just that, which is the actual cash a college generates per student.  (The arithmetic behind those last two is sketched just after this list.)  Use the highlight function to show a single college or highlight a region for comparison.

And finally, 8) Single Institution allows you to see all three variables for a single institution at once. They are colored by region. You can sort by any column just by hovering over the axis and clicking the pop-up icon.  Sort descending by value, ascending by value, or alpha by name as you cycle through the clicks.
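For reference, here's my reading of the Discount and Average Net Revenue calculations mentioned above, sketched with made-up totals:

```python
import pandas as pd

# Hypothetical institution-level totals for one freshman class.
inst = pd.DataFrame({
    "institutional_aid":  [12_000_000],   # total institutional grant aid awarded
    "gross_tuition_fees": [30_000_000],   # sum of tuition and fees charged
    "freshmen":           [1_000],
})

inst["discount_rate"] = inst["institutional_aid"] / inst["gross_tuition_fees"]
inst["avg_net_revenue"] = (inst["gross_tuition_fees"] - inst["institutional_aid"]) / inst["freshmen"]
print(inst[["discount_rate", "avg_net_revenue"]])    # 0.40 and $18,000 per freshman
```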

If your data are wrong, talk to your IR office.  If all data are wrong, drop me an email as I may have made a calculation error.  Otherwise, drop me a note and let me know what you think.


Wednesday, May 30, 2018

Measuring Internationalism in American Colleges

How international is a college?  And how do you measure it?  There are certainly a lot of ways to think about it: Location in an international city like New York, Chicago, or Los Angeles, for instance.  The extent to which the curriculum takes into account different perspectives and cultures, for another.

And, of course, there is some data, this time from the IIE Open Doors Project.  I did a simple calculation, taking the number of international students enrolled plus the number of enrolled students studying abroad, and dividing that sum by total enrollment to come up with an international index of sorts.

No, it's not precise, and yes, I know the two groups are not discrete, but this--like all the data on this blog--is designed to throw a little light on a question, not to answer it definitively.
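For the record, the calculation is just (international students + study-abroad participants) divided by total enrollment.  A quick illustration with made-up numbers:

```python
# Hypothetical campus; the index approximates the chance a randomly chosen
# student is either international or studied abroad in the last year.
international_students = 1_200
study_abroad_students = 800
total_enrollment = 10_000

international_index = (international_students + study_abroad_students) / total_enrollment
print(f"{international_index:.1%}")   # 20.0%
```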

You'll find data on all the colleges that participate in the IIE survey, displayed in four columns:  Total enrollment (on the left), International enrollment, Overseas study numbers, and the International Engagement Index, which is roughly the chance that a randomly selected student is either international or studied abroad in the last year.

The colleges are sorted by the first column, total enrollment.  You may want to see who has the most international students, or the highest International Index.  It's easy to sort these columns by hovering over the small icon near the axis label, as pictured below and indicated by the yellow arrow.  There is one for each column; give it a try, and if you get stuck, use the reset button.


As always, feel free to leave a comment below.



Thursday, May 10, 2018

Looking at Transfers

It's official: Princeton has broken its streak of not considering transfer students for admission, and has admitted 13 applicants for the Fall, 2018 term of the 1,429 who applied, for an astonishing how-low-can-you-go admit rate of 0.9%.  Of course, we'll have to wait until sometime in the future to see how many--if any--of them actually enroll.

I thought it might be interesting to take a look at transfers, so I did just that, using an IPEDS file I had on my desktop.  There are four views here, and they're pretty straightforward:

The first tab shows the number of transfers enrolled by institution in Fall, 2016 (left hand column) and the transfer ratio.  The ratio simply indicates how many new transfer students you'd meet if you went to that college campus in Fall, 2016 and chose 100 students at random.  A higher number suggests a relatively more transfer-friendly institution. You can choose any combination of region, control, and broad Carnegie type using the filters at the top.
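Put another way (with made-up numbers), the ratio is simply new transfers per 100 enrolled undergraduates:

```python
# Hypothetical campus for Fall, 2016.
new_transfers = 450
undergrad_enrollment = 9_000

transfer_ratio = new_transfers / undergrad_enrollment * 100
print(round(transfer_ratio, 1))   # 5.0 -- about 5 of every 100 students are new transfers
```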

The second tab shows the same data arrayed on a scattergram; type any part of a college name and then select it to see it highlighted on the chart.  Hover over a point for details.

The third chart is static, and shows undergraduate enrollment in Fall, 2016 and the number of new transfer students in the same term.  The bars are split by region and colored by Carnegie type.

And the last tab shows the weighted transfer ratios, split the same way.

As you'll see, thirteen students doesn't seem so significant against the 810,000 new transfers in Fall, 2016.  But it's a start.




Monday, May 7, 2018

Want to increase graduation rates? Enroll more students from wealthier families.

OK. Maybe the headline is misleading.  A bit.

I've written about this before: The interconnectedness of indicators of college success.  This is more of the same, with fresher data, to see if anything has changed. Spoiler alert: Not much.

What's new this time is the IPEDS publication of graduation rates for students who receive Pell and those who don't, along with overall graduation rates.  While the data are useful in aggregate to point out the trends, at the institutional level, they are not.

First, some points about the data:  I've included here colleges with at least 20 Pell-eligible freshmen in 2015, just to eliminate a lot of noise.  Colleges with small enrollments don't always have the IR staff to deliver the best data to IPEDS, and they make the reports a bit odd.  And even without these institutions, you see some issues.

Second, colleges that do not require tests for admission are not allowed to report tests in IPEDS.  Once you check the "not required" box, the box with test scores gets grayed out, so attempting to report them is futile.

But, it's here.  View one shows pretty much every four-year public and private not-for-profit college in the US, and includes four points: On the left as dots are six-year grad rates for all students (light blue), Pell students (dark blue), and non-Pell students (purple).  On the right is the gap between the Pell and non-Pell grad rates.  Again, some of these numbers are clearly wrong, or skewed by small numbers in spite of the exclusion noted above.

The next four collectively tell the story of wealth and access:


  • If you have more Pell students, your graduation rate is lower
  • While most colleges do a pretty good job of keeping Pell and non-Pell grad rates close, there are some troubling outliers
  • If you focus on increasing SAT scores in your freshman class, you'll pretty much assure yourself of enrolling fewer low-income students
  • But if you have higher mean freshman test scores, you'll see higher grad rates
In other words, test scores are income; income is fewer barriers to graduation.  And colleges are thus incentivized not to enroll more low-income students: It hurts important pseudo-measures of quality in the minds of the market: Mean test scores, and graduation rates.

If you're interested in a much deeper dive on this with slightly older data, click here. Otherwise feel free to play with the visualization below.


Thursday, March 29, 2018

How have admit rates changed over time?

Parents, this one's for you.

Things are different today, or so everyone says.  If you want to see how admit rates have changed over time at any four colleges, this is your chance.  Just follow the instructions and take a look to compare how things have changed over the years.  The view starts with four similar midwestern liberal arts colleges, but you can compare any four of your choice.  (And before you ask, 2016 is the most recent data available in IPEDS).

And, a note: These changes are not all driven solely by demand.  Colleges can manipulate overall admit rates by taking a larger percentage of their class via early programs, and admit rates in those programs can be as much as 30 points higher than in regular decision.


Tuesday, March 13, 2018

Early Decision and Early Action Advantage

There is a lot of talk about admission rates, especially at the most competitive colleges and universities, and even more talk, it seems, about how much of an advantage students get by applying early, via Early Decision (ED, which is binding) or Early Action (EA, which is restrictive, but non-binding).

I license the Peterson's data set, and they break out admissions data by total, ED, and EA, and I did some calculations to create the visuals below.

Two important caveats: Some colleges clearly have people inputting the data who do not understand our terminology, who don't run data correctly, or who make a lot of typos (a -500% admission rate is probably desirable, but not possible, for instance).  Second, not every university with an EA or ED option (or any combination of them, including the different ED flavors), breaks out their data.

Start with the overall admit rate.  That's the one that gets published, and the one people think about. It's the fatter, light gray bar.  Then, the purple bar is the regular admit rate, that is, the calculated estimate of the admit rate for non-early applications (this is all applications minus all early types).  The light teal bar is the early admit rate: ED plans on the top chart, and EA plans on the bottom.  Some colleges have both, of course, but most show up only once.
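As I calculate it, the regular rate strips the early activity out of both the numerator and the denominator.  A sketch with hypothetical numbers shows why the gap matters:

```python
# Hypothetical college reporting overall and Early Decision figures.
total_apps, total_admits = 30_000, 6_000     # overall admit rate: 20%
early_apps, early_admits = 3_000, 1_500      # ED admit rate: 50%

regular_admit_rate = (total_admits - early_admits) / (total_apps - early_apps)
print(f"{regular_admit_rate:.1%}")           # 16.7% for the regular pool
```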

You can use the filter at right to include colleges by their self-described level of admissions difficulty.

Working on another view to show the number of admits scooped up early vs. regular.  Stay tuned.  Until then, what do you notice here?  Leave a comment below.


Thursday, March 1, 2018

Tuition at State Flagships

The College Board publishes good and interesting data about college tuition, including a great table of tuition at state flagship universities. (I realized while writing this that I don't know how a university is designated a state flagship.  Maybe someone knows.)

There is some interesting stuff here, but I'll leave it for you to decide what jumps out at you: If you live in North Dakota, you might wonder why South Dakota has such low tuition for non-residents.  If you live just outside Virginia or Michigan, you might wonder why it costs so much to cross the border.

Anyway, using the tabs across the top, there are five views here:

Maps

Four maps, showing (clockwise from upper left) in-state tuition, out-of-state tuition, non-resident premium index (that is, how much extra a non-resident pays, normalized to that state's in-state tuition), and the non-resident premium in dollars.  Hover over a state for details.  You can change the year, and see the values in 2017 inflation-adjusted dollars, or nominal (non-adjusted) dollars.
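For clarity, here's how I compute the two premium measures, with made-up tuition figures:

```python
# Hypothetical flagship tuition figures.
in_state_tuition = 12_000
out_of_state_tuition = 36_000

premium_dollars = out_of_state_tuition - in_state_tuition    # $24,000 extra
premium_index = premium_dollars / in_state_tuition           # normalized to in-state tuition
print(premium_dollars, f"{premium_index:.0%}")               # 24000 200%
```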

States in Context

This arrays the states by tuition over time.  Use the highlight functions (go ahead, type in the box; you won't break anything) to focus on a region or a specific state. You can view resident or non-resident tuition, adjusted or non-adjusted.

Single Institution

Just what it says.  The view starts with The University of Michigan, but you can change it to any state flagship using the control at top right. Percentage increase is best viewed in 2017 adjusted dollars, of course.

Percentage Change

Shows change of in-state tuition by institution over time.  The ending value is calculated as a percentage change between the first and last years selected, so use the controls to limit the years.  Again, highlight functions put your institution in context.

Non-resident Premium 

This shows how much extra non-residents pay, and trends over time.  Again, highlighter is your best friend.

Feel free to share this, of course, especially with people who are running for office in your state.

And, as always, let me know what you think.






Monday, February 26, 2018

College Board AP Data

The College Board recently released data on its AP Exams.  I've downloaded several workbooks already, and of the one I've dug into, I've only been able to get through two worksheets.  The data presentation is clunky (please, agencies, provide un-pivoted data without merged cells and totals and all that stuff, if not by itself, then as a companion), but it reveals some interesting patterns.

Well, I think so.

I've visualized it in five views.  The source of the data is here, in case you want to download it yourself.

View 1, Totals (using the tabs across the top) is just totals: Use the controls to show males or females, or certain scores, or certain exams.  I think it's very compelling, especially if you look at the high scores in light of the College Board's claims about AP opening access to selective institutions.

View 2, Scores by Ethnicity and Exam, shows score distributions of the four largest ethnic groups.  Filter by a single exam if you'd like.

View 3, 100% Stacked Bars, shows the same data, presented by ethnicity.  Again, filter to a test if you'd like.

View 4, Mean Scores by Ethnicity and Exam, arrays all tests, and breaks out mean scores (yes, I know you shouldn't take averages of string variables.  So sue me).  Use the highlighter if you'd like to make any of the groups stand out visually, and filter by gender if you'd like.

View 5, Mean Scores by Gender and Exam, shows the differences between males and females. Filter to a single ethnicity if you'd like.

Tell me what you see.  Does this change your perspective on the College Board claims, or does it strengthen them?  Does it help you make up your mind?

I'd love to hear.


Wednesday, January 31, 2018

How is College Enrollment in the US Changing?

College enrollment is down.  Or maybe it's up.  Or maybe it's both.

When you read headlines, you don't get a lot of nuance. And in a country as big as ours, with such an incredible diversity of programs and widely divergent institutions, nuance is important.  So this may help do the trick.

This is enrollment data from about 6,600 post-secondary institutions in the US, and goes back as far as 1980.  It includes every institution, including those that grant degrees, and those that don't; four-year private, not-for-profits, for-profits, and publics; liberal arts colleges, research universities, and technical institutes.  All here.

It's on two dashboards.  The first shows all undergraduate and graduate enrollment at all these institutions, since 1980.  (Note: The data skips from 1980 to 1984, and I took out two years of data--1998 and 1999--because they looked a little funky.)

On the first dashboard, there are several controls to filter the data.  So for instance, if you want to look at just doctoral institutions, you can do that.  Just colleges in New England? Yes.  Only care about full-time enrollment? Just use the filter to select it.  If graduate enrollment is your interest, it's easy to get rid of the undergraduate data.  Just use the controls.  The top chart shows raw numbers, and the bottom chart shows percent change over time.  If you want a longer or shorter window, there's a control to limit the number of years.  This is especially helpful to show percent change.

Then, you can break out whatever enrollment you've selected.  Use the control titled "Color Lines By" and you can split the data shown into groups.

Try it.  You won't break anything.  You can always reset using the little reset button at the bottom.

The second dashboard (using tabs across the top) shows similar data, but you can choose an individual college.  Once you've done so, you can limit the data shown, and you can also split it out according to your interest.

Have fun.  I've found some interesting little ditties I'll be tweeting out, and I encourage you to do the same.


Thursday, January 25, 2018

A Quick Look at the NACUBO Endowment Data

Each year NACUBO releases its study of endowment changes at about 800 colleges and universities in the US and Canada.  For this post, I'm including only those institutions in the US, and only those who reported two years of data to the survey, or about 787 institutions.

Higher Education in the US, of course, is a classic story of the haves and have nots; a few institutions near the top of the endowment food chain have amassed enormous endowments, allowing them great freedom in the programs they offer and the students they enroll. In fact, the 21 most well endowed institutions control over half, or about $280B of the $560B held overall, leaving the other 766 to divvy up the remaining $280B among them; the top 93 own 75%.

What's more interesting, I think, is the astonishing endowment growth: Stanford added $2.4B to its endowment in one year.  That amount is bigger than all but 38 of these institutions' total 2017 values.  In other words, if the gain on Stanford's endowment were an endowment itself, it would be the 39th largest endowment in the nation.  And in total value, Stanford's endowment still trails Harvard's by about $12B.

A couple of notes: Endowment growth is not the same as investment performance.  Some of the growth or loss can be accounted for by additions and withdrawals as well.  Second, endowments are not a big pot of money the college can spend as it wishes.  Some percentage of the income from endowments is restricted to certain programs, and restricted gifts often carry additional expenses the college has to come up with on its own.

Still, I think this is interesting and compelling.  Let me know what you think.




Monday, January 15, 2018

National Trends in Applicants, Admits, and Enrolls, with Draw Rates

If you read this blog regularly, you'll know I'm interested in the concept of the Draw Rate, a figure seldom used in college admissions.  Many people, when thinking about market position in higher education, use selectivity or admit rate (the percentage of applicants admitted), or yield rate (the percentage of students offered admission who enroll) by themselves.

But in the market of higher education, these two variables often fight against each other. (BTW, if you object to the use of the word "market" in higher education because you think it debases our profession, see what Zemsky, Wegner, and Massy have to say about that here.)

Colleges, driven by market expectations, have for a long time tried to increase applications, believing that what the market wants is greater selectivity in the institution they choose, based on the Groucho Marx effect. Except that in order to enroll the class you want, you have to take more students when apps go up (at least in the case of the bottom 90% of colleges).  That's because your incremental applications almost certainly have a lower propensity to enroll.

So, Draw Rate (yield rate/admit rate) helps account for that.  Higher Draw Rates are generally a sign of higher market position.  Think about it mathematically: A very high numerator (high yield) coupled with a very low denominator (low admit rate) is the thing many colleges pursue.  If you pursue greater selectivity and don't account for the lower yield, you won't be in enrollment management too long.
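A quick worked example, with hypothetical numbers, makes the point:

```python
def draw_rate(admit_rate: float, yield_rate: float) -> float:
    """Draw rate = yield rate / admit rate; higher generally means stronger market position."""
    return yield_rate / admit_rate

# Two hypothetical colleges:
print(draw_rate(admit_rate=0.10, yield_rate=0.60))             # 6.0  -- strong market position
print(round(draw_rate(admit_rate=0.70, yield_rate=0.20), 2))   # 0.29 -- working hard to stay even
```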

The problem, of course, is that, in general, people who were not born 18 years ago don't apply to college.  And the number of people who will turn 18 in any given year continues to drop going forward.  So no matter how many applications each student makes, they can only go to one college next fall.

Over the past several years, the "Winner Take All" mentality has driven demand at the most selective institutions.  The need to keep up trickles down to each tier below, and the annual "We received a record number of applications for this freshman class" shtick gets old fast, even if colleges have not gotten that message yet.

The takeaway: Colleges have been spinning their wheels, working harder and harder to generate more applications just to stay even.  The national psychosis weighs heavily on the minds of parents and students, and they respond by hedging their bets, applying to--guess what--more colleges.  And the spiral spirals out of control.

Here are five views (using tabs across the top) to show the data.

Dashboard 1 is a high-level overview of applications, admits, and enrolls at four-year public and four-year private, not-for-profit institutions (open admission institutions do not report application activity to IPEDS).  You can use the control at top to show all institutions, or just public or private.  Top view is raw numbers; bottom is percent change.

Dashboard 3, the next tab, shows the same data on bar charts, with the draw rate as a brown line hovering over the bars.  Note how it's dropped over time: This is the effect of soft applications.  You can look at any region, or any single institution if you want, but the really interesting filter is at top right: Compare colleges by their 2016 selectivity.  You see that the only institutions who have collectively increased their draw rates are exactly the ones who had the strongest market position already: The most selective colleges.  Step down from Most to Highly to Very, etc, and watch the trend on the brown line.

Next comes Dashboard 2, showing Applications per Seat in the Freshman Class, and draw rate by region.  This might explain why we in the Midwest are fascinated with the obsession with college admissions by East and West Coast media.  Y'all are welcome to come to the Midwest and chill, if you'd like.  You can use the filter to select groups of colleges by Carnegie type.

Dashboard 4 shows four key metrics to reinforce the relationship between and among them.  Again, select by 2016 Selectivity to see how they make a difference.

Finally, Dashboard 5 allows you to compare individual institutions.  I've put Harvard, Stanford, and MIT on to start, but you can choose any colleges you wish.  (I recommend no more than three or four at a time.)  To remove a college, hover over its name in the filter and X it out.  To add, type any part of the name and hit "Enter" on your keyboard.  You'll be presented with all possible matches, and just choose the ones you want.  I recommend choosing similar institutions for scaling/charting purposes.

I hope this is interesting to you; let me know what you see, and if you spot any problems.




Wednesday, January 3, 2018

Freshman Migration, 2010-2016

This is perhaps the most popular, as well as my personal favorite, post, and I'm sad that I can only do it once every two years (as the IPEDS reporting cycle only requires this data be reported every other year.)

This shows patterns of freshman migration within and outside of state boundaries. It's valuable to people because you can see the composition of freshman classes at colleges: Where do the students come from? You can also see patterns of state exports: Which states keep students at home, and which send them out-of-state (of course, the size and educational offerings of the various states means it's often unfair to compare, but it's still interesting.)

For this, I've limited the universe to four-year, public and private, not-for-profit institutions. Community colleges and for-profit colleges tend to have very local enrollment patterns, and high numbers of part-time students. I've also taken out institutions whose primary focus is religious training, as well as those from a few obscure Carnegie categories.

The freshmen in this analysis are only those who graduated from high school within twelve months of enrolling in college. A word of caution: If you are afraid to click buttons and interact, stop now. This won't be of any help to you. You can't break these, and you can always reset using the controls at lower right. So click around and explore the data.

Finally, this shows the data I downloaded. Some of it is pretty clearly wrong, but that's not my problem. Contact the IR office at the offending institution and ask them what they were thinking.

So, first up: If you want to compare any four colleges on the geographic composition of their freshman classes, start here. I've added four colleges that start with "D" but you can use the controls to look at any four you want. Note: Students labeled as "in-region" are from the region, but not the state. Therefore someone "in-region" in a New Hampshire college would be from one of the five other New England states. Got it? Good. Play away on this one:



Next up: Looking at the bar charts: It's a little more complex, but you can do it.  If you want to see which colleges enroll the most (top chart) or highest percentage (bottom chart) of students from in-state, in-region, or out-of region, this is your visualization. Choose a year (it defaults to 2016), and if you wish, limit it to colleges in a region (The Southeast, for instance).  You can limit to public or private as well.  Then choose which group of students you want to explore: In-state, in-region, or out-of-region.  Again, comparing Texas to Rhode Island should only be done for the "interestingness factor," not to draw conclusions.



Here is the same data, represented on a scatter plot, in case you want to step back, and see the data all at once.  The two scales are the number of freshmen, and the percent from the region selected.



Which states export the most students, and when they export them, where do those students end up?  If you've wondered that--or if you're from Illinois or New Jersey and lament our students' mobility--this is the visualization for you.

Choose a year, and see (on the top bars, in purpley-mauve) which states exported the most students.  Then, click on a bar representing a state to see where students from that state enrolled, in the bottom chart.  If you want the college destinations to be limited to public or private, or a certain region, you can use those controls to do so.



And finally, if you're interested in which states keep students at home, you can see that, too, on this visualization. The top view looks at colleges in a state, and where their students come from; the bottom looks at students from that state, and whether they go out-of-state or stay in-state.  Again, choose a year or institutional type, if you want to look at colleges or students going to those types of colleges.



I hope you have enjoyed looking at this data as much as I have enjoyed playing with it. If you spot any errors that I've made (Tableau still has no spell check....) let me know, and I'll get to fixing them right away. Otherwise, leave a comment below with questions or observations.

Wednesday, December 13, 2017

How Many Colleges are There in America?

Seems like an easy question: There are 7,284 post-secondary options in the US.

But everyone has a different definition of what they want when they ask for a count of colleges.  This should give you some clearer sense of the right answer for you.

At top left is "The Answer," and that will not change as you navigate through this.  But you can use the controls here to change the number of colleges and universities you're looking at, and to change how they're broken out.

Those controls change the number (in orange, at top) and the splits.

For instance, at the far right, on the control labeled "Region," choose "Great Lakes," and you'll see that there are 1,079.  On the gray box at top right, choose "State" and you'll see 354 in Ohio.  Under "Control of Institution" choose "Public" and you'll get 266.  And so on.  Now break out by "Campus Location" and see that most are located in cities.

The reset button is at lower right.

I hope this is helpful to you as you wonder about the shape and size of American higher education.


Monday, December 11, 2017

What's All The Fuss About, Redux

My tireless crusade continues.

Everywhere you look, it seems most of the discussion and ink spent on higher education focuses on the most selective institutions in America.  In addition, if you listen to parents and students and counselors talk, you'll learn that there is a perception that college is increasingly hard to get into.

So, I broke the whole world of 1,403 four-year private, not-for-profit and public colleges and universities into bands, based on the absurd input measure of their freshman selectivity.  On the visualization below, they range from red (less than 15% admitted) to purple (over 60% admitted).

Each institution falls into one of these boxes.

The four charts, clockwise from top left: The number of colleges in those categories, the number of freshmen they enroll, the total number of freshmen with a Pell grant, and the total undergraduate enrollment.

If you think you see a lot of purple, you do.  And this is before anyone enforces any sort of standard definition of what an "applicant" is.  Sometimes, it's just a person who accidentally clicks on an email link.

Of course, sometimes the scarcity of a good is exactly why people freak out about it. And of course, this doesn't even consider open admissions colleges (nine percent of all college enrollment in the US is in California's Community College System). So, this won't change the world, but I feel better for sharing.  Now you can't say you weren't told.


Friday, December 1, 2017

2016 IPEDS Admissions Data

Fresh from IPEDS, just months after the wrap up of the 2017 admissions cycle, comes the 2016 admissions data.

I've done something a little different this year to focus your attention, using five views of data, navigable via the tabs across the top of the visualization:


Admissions data (first tab) is pretty clear.  Colleges display admit rates (overall, in red) and then admit rates by gender (men are in blue; women are in orange).  If the blue bar extends beyond the orange, you can see that the admit rate for men is higher, and vice versa.

On the right are standardized test scores, showing calculated means.  In other words, since no one publishes averages and everyone wants them, I took the mid-point of the 25th and 75th percentiles to approximate the 50th percentile.  Note that IPEDS does not allow colleges that are test-optional to report test score information.  Also note that I've taken out a lot of colleges with extremely limited or suspect data.
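The approximation is nothing fancier than the midpoint of the two reported percentiles; for example, with hypothetical values:

```python
# Hypothetical IPEDS values for one college's SAT section.
sat_25th = 580
sat_75th = 680

approx_mean = (sat_25th + sat_75th) / 2   # stand-in for the unpublished average
print(approx_mean)                        # 630.0
```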

As always, you can play with the filters (if there are any) to limit the colleges displayed, and you can sort columns by hovering until you see this little icon, and then click on it.

You can reset the view by clicking this little icon at lower right.

The four other views show a limited scope of colleges: Selective, wealthy, mostly male, and Land Grant institutions, plotted using some variables that should both answer and generate questions.

You be the judge. 



Tuesday, August 15, 2017

Chasing the Endowment Unicorn

Higher education is struggling these days, and there are a lot of solutions from a lot of pundits, all of which tend to be macro in nature: Delivery, cost structures, optimization, curricular adaptations, and many other ideas abound.

On the micro level, however, the vast majority of the 1,700 or so private, four-year colleges and universities will point to "increasing our endowment" as one of the most crucial solutions to our internal institutional challenges.

This is, in all probability, because the wealthiest institutions in the nation (in terms of endowment resources) are also the best known, and much of the brand of any institution is driven by wealth and reputation and prestige.  And even in this decade and these trying times, some of these institutions have parlayed considerable investment income into one-year operating surpluses of over a billion dollars. No, that's not a typo; it's a problem every university president would love to have. (Reminder to self: Update this chart.)

I once had a finance professor suggest that every institution should multiply the amount of money spent on Advancement each year by 20, then consider these options:

Let's say your Advancement Office budget is $8 million per year.  It would take an endowment increase of about $160 million to throw off that $8 million in cash each year forever (at 5%). Thus, shutting down the Advancement function completely would be the equivalent of raising $160 million in unrestricted endowment overnight. Unrestricted dollars are the hardest to raise, of course, because people don't tend to say, "Here's five million dollars; do with it whatever you want."
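The arithmetic behind that rule of thumb, for the curious (the 5% payout is the professor's assumption, not a universal rule):

```python
advancement_budget = 8_000_000      # annual spending on Advancement
payout_rate = 0.05                  # assumed sustainable endowment draw

equivalent_endowment = advancement_budget / payout_rate
print(f"${equivalent_endowment:,.0f}")   # $160,000,000 in unrestricted endowment
```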

(It's also a good time to remind people that much endowment money is restricted; the $20 million gift from a big donor doesn't usually provide general operating relief but instead is used to fund some center or institute or faculty chair the donor thought was a good idea.  So in some sense, total value of the endowment can be occasionally misleading. It's still generally better to be bigger, though.)

Due to head starts and compounding, the wealthiest institutions are so far ahead of the rest of us that even trying to catch up seems futile.  Of course, that stops no one from relying on the old "tried and true."  In reality, our only hope of catching up with them would be a catastrophic market crash with no rebound; even then, we'd all be poor.  No solace there.

Take a look at the interactive visualization below.  Each bubble is an institution.  Hover over a bubble for details.
  • The SIZE of the bubble indicates endowment value at the end of FY 15 (probably June 30, 2015)
  • The COLOR of the bubble indicates tuition dependency (in IPEDS, "Percent of core revenues from tuition and fees.") Orange is low; blue is high.
  • The relative position on the y-axis (up and down) indicates one-year endowment value change (note: This is just subtraction, so it is not endowment performance).
  • The relative position on the x-axis (left and right) shows the one-year percentage change.  I cut it at 50% each way for clarity as there were a few extreme outliers.
If you'd like, you can use the filters at the top right to limit the types of institutions shown, or the range of endowment values.  Use the highlighter at the top left to highlight a specific institution.  Just start typing any part of the name to do so.

How do you feel now?


Saturday, July 8, 2017

Changes in College Attendance by State and Ethnicity, 2005-2015

Note: If you haven't read my post about the 2016 election results and educational attainment, it might be of interest to read that first.  Or later.  Or not at all. Your choice.

This started simply enough: A couple of tables from the Digest of Education Statistics (tables 302.65 and 302.70) showing the percentage of adults aged 18-24 who were attending a degree-granting college by state and ethnicity in 2005 and 2015.  If you've read this blog enough, you know I have a love/hate relationship with the Digest: Great data, but horrible formatting.  The tables are made to be printed on a single 8.5" x 11" sheet and handed out.  The crucial distinction between data and insight is lost.

Regardless, I reformatted the sheets into something workable for Tableau, and started to look at them. I wasn't having much luck: Some of the states didn't have data on African-American students, for instance, in 2005.  The variable for "Asian/Pacific Islander" was relatively new then, and only a few states had that data available.  Beyond that, I was looking to add some color-coding into the visualization to help make a point, and it wasn't going well.

But I've been fascinated since the election by some of the tweets and writing of Chris Arnade and Sarah Kendzior, who are thinking about what the election results mean in "flyover land."  And my blog post about the election results and attainment has stuck with me, mostly because of the reaction people had to it.

So I colored the states by the 2016 election results, and it got more interesting, as you can perhaps see below.

It's easy for us to look at things like this and chalk it up to "uneducated people voted for Trump." While that may technically be true, leaving it at that makes it too convenient for us in higher education to forget that educational attainment is only partially something you earn; it's also something you're born into.  Some of the ten charts on this post might make that clearer.

This can also, of course, be a post about urban and rural divides. The division in our country might be as much about opportunity as it is about attainment.  If history tells us anything, it's that people start to rebel when they feel they don't have a chance via any other path.

So as we look at the current reality, the question, as always, remains: What are we doing to change the future?