Wednesday, April 1, 2015

Sorry, Harvard, Princeton, Yale, and Stanford. You Lose


This week, all the hype about college admissions comes out.  Blah blah blah this college admitted only 7%.  Blah blah blah Oh Yeah? We admitted only 6%! We're better.

So, I says to myself, "Myself, I says, what is the real measure of the best college?" And it became clear: "The best college is the one everyone wants to copy!" And then I asked myself, "How do you do that?"  "Why," I replied, "by copying the name."

So here are the most common names of colleges in America.  As you can see, Columbia College and Bryan University are duking it out for the top spot.  If DeVry got their act together, they could win every year, as an informal analysis says there are about a bazillion of them.

So, sorry Duke (if that is your real name.)  Clearly, no one wants to be associated with the likes of you.


Tuesday, March 31, 2015

How Admissions Has Changed, in One Chart

I frequently hear that the interactive charts I publish are too confusing or time-consuming, and that it's hard to get the story out of them without some work. So today, I'm making it easier for you, for two reasons. First, this is real student data, not summaries: Each dot represents a student who applied for financial aid, and I'd never publish that underlying data on the web, so what you get is just a good, old-fashioned picture of a chart.  Second, in this case, one chart tells the whole story.

The population here is all freshman financial aid applicants who completed a FAFSA but did not have need.

Each column is one year, and each dot in that column represents a student; higher positions in the column indicate higher income, from zero to one million dollars in parental AGI (adjusted gross income).  The data are arrayed as a box-and-whisker plot, or box plot.  The yellow boxes show the limits of the middle 50% of the distribution (the "box"), with the color break representing the median.  The top whisker in each column (the black horizontal line) marks the 75th percentile; in other words, 25% of the applicants have incomes above that line.  The bottom whisker bounds the lowest 25 percent. Yes, there are people with very low incomes who do not qualify for need-based aid, usually because of large asset bases.
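
For the curious, here's roughly how those box-plot statistics get computed. This is a minimal sketch assuming a pandas DataFrame with hypothetical columns `year` and `parent_agi`; the actual student-level file is not something I'd publish.

```python
import pandas as pd

# Hypothetical student-level records: one row per no-need FAFSA filer.
df = pd.DataFrame({
    "year":       [2007, 2007, 2007, 2007, 2014, 2014, 2014, 2014],
    "parent_agi": [60_000, 180_000, 310_000, 430_000, 95_000, 240_000, 420_000, 600_000],
})

# The pieces of each year's box-and-whisker column: the 25th percentile,
# the median (the color break), and the 75th percentile (the top black line).
summary = df.groupby("year")["parent_agi"].quantile([0.25, 0.50, 0.75]).unstack()
summary.columns = ["p25", "median", "p75"]
print(summary)
```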

Note the way the black line rises over time, from about $430,000 in 2007 to almost $600,000 in the last two years.  There are several possible explanations for this, all of which are probably valid to some extent.

  • It's a buyer's market, and college recruitment activities have brought in people who are shopping in more places.
  • People who never would have applied for aid in prior years are doing so, because the crisis of 2007 wiped out many assets, like home equity, that people might have used to pay for college.
  • Other colleges require a FAFSA for merit aid consideration, so we get the FAFSA as a residual.  No one, it seems, is opposed to trying to get a lower price.
  • Colleges are so afraid of losing someone over price that they encourage everyone to "give it a shot" and see if they're eligible.
One note: In 2014 we had 31 applicants whose income was $1,000,000 or more; they are not shown here, and they would have pulled the distribution up. In prior years, incomes at that level showed up as $999,999, so I removed them for an equal comparison. And, in anticipation of the next bump, we did have one family that reported an AGI of $9,999,999 on the 2014 FAFSA.

This post shows Financial Aid data, but the title says it's about how admissions has changed. What do you think? How are the two related?




Tuesday, March 24, 2015

Application Fees

Ever since my first day in admissions, I've had a big problem with the concept of college application fees.  They just seem odd to me: You pay some amount of money for the privilege of being considered for admission, often not certain you'll attend if you are.  And if you're not admitted, you're out of luck.

I understand the argument for them, in concept: Students shouldn't apply to a lot of colleges, and they should be somewhat serious about the colleges they apply to.  Except we know that doesn't happen.  The counselor at my kids' school said a few years ago that one student applied to 46 colleges, and the Fast Apps, Snap Apps, and VIP apps all encourage students to apply to places just because they can.

I also realize that there are costs associated with processing applications, although those costs have dropped pretty dramatically in the past several years, especially when all the documents come in electronically. But all the costs of doing business are paid by the students who pay tuition, and, presumably, more applications are good for the college they attend.

There may be other models where this system is used, but I'm not able to come up with any.  All I can think of is having to pay $50 just to walk onto the Toyota lot and shop for cars (which is hardly a perfect analogy, either).

Too often in the discussion about things like "admit to deny," people will point out that app fees from students who have little chance of being admitted are a revenue source for colleges.  Technically yes, but actually no. At most institutions, it's about 1/10th of 1% of total revenue.
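
If you want to see why, the back-of-the-envelope arithmetic is pretty stark. These numbers are invented for illustration, not taken from any particular college:

```python
# Invented figures for a hypothetical mid-sized private university
applications    = 15_000
application_fee = 50             # dollars
waiver_rate     = 0.30           # share of applicants whose fee is waived
total_revenue   = 500_000_000    # dollars, all sources

fee_revenue = applications * application_fee * (1 - waiver_rate)
print(f"${fee_revenue:,.0f} in fee revenue, or {100 * fee_revenue / total_revenue:.2f}% of total revenue")
# -> $525,000 in fee revenue, roughly 1/10th of 1% of total revenue
```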

So, take a look at what colleges charge to apply.  This visualization starts with just under 2,000 four-year colleges and universities, each represented by a dot.  IPEDS apparently lists the highest fee a college charges when there are multiple levels.

Hover over the dot for details.  The bar chart at the bottom shows the breakouts as a percent of total. Use the filters on the right to show a smaller set of colleges.

What do you notice?



Monday, March 23, 2015

Another Way of Looking at Graduation Rates

Another article about college ROI appeared in my Facebook feed, this one called the 50 Best Private Colleges for Earning Your Degree on Time.   As is often the case, there is nothing really wrong with the facts of the article: You see a nice little table showing the 50 colleges with the highest graduation rates.

But it got me to thinking: What if a high graduation rate weren't enough?  What if a considerable portion of the freshmen who graduate take longer than four years to do so? Is that a good deal?  Let's take some hypotheticals:

College A: 1000 freshmen, 800 who graduate within four years, 900 who graduate in five, and 950 who graduate in six.  So the four-, five-, and six-year graduation rates are 80%, 90%, and 95%.  But of the 950 who eventually graduate, only 84.2% do so in four years.

College B: 1000 freshmen, 750 who graduate within four years, 775 who graduate in five, and 800 who graduate in six.  So the four-, five-, and six-year graduation rates are 75%, 77.5%, and 80%. Thus, of the 800 who eventually graduate, almost 94% do so in four years.

College C: 1000 freshmen, 550 who graduate within four years, 600 who graduate in five, and 625 who graduate in six.  So the four-, five-, and six-year graduation rates are 55%, 60%, and 62.5%. Of the 625 who eventually graduate, 88% do so in four years.

If you were choosing among these three colleges, which might you choose?  The easy money says you go with College A, the one with the highest graduation rate. College B would be your second choice, and C would be your third.  But what if you are absolutely, positively certain you'll graduate from the college you choose? College B is first, then College C, then College A.
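
If you want to play with the numbers yourself, the arithmetic behind those hypotheticals fits in a few lines:

```python
# (freshman cohort, 4-year grads, 5-year grads, 6-year grads) for the hypothetical colleges above
colleges = {
    "College A": (1000, 800, 900, 950),
    "College B": (1000, 750, 775, 800),
    "College C": (1000, 550, 600, 625),
}

for name, (cohort, four, five, six) in colleges.items():
    rates = f"{four / cohort:.1%} / {five / cohort:.1%} / {six / cohort:.1%}"
    on_time_share = four / six   # share of eventual graduates who finished in four years
    print(f"{name}: 4/5/6-year rates {rates}; {on_time_share:.1%} of graduates finish in four")
```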

Data can be tricky.  And as I've written many times, things like graduation rates are really almost inputs, not outputs: If you choose wealthy, well-educated students, you're going to have higher graduation rates.  It's a classic case of making a silk purse out of, well, silk.

I've tried to demonstrate this in this visualization, and I like the simplicity here.  Each dot is a college (hover over it for details).  They're in boxes based on the average freshman ACT score across the top, and the percentage of students with Pell along the side.  The dots are colored by four-year graduation rates, and you should see right away the pattern that emerges.  Red dots (top right) tend to be selective colleges with fewer poor students.

But if you want to look at the chance that a graduate will finish in four years, use the filter at the bottom right.  Find a number you like, pull the left slider up to it, and see who remains.  (Just a note: I'm a little suspicious of any value of 100% on this scale, which would mean absolutely no graduates took longer than four years.  It might be true, but it's hard to believe, so I'd set the right slider to 99% at the most.)  Remember, there's a lot of bad IPEDS data out there, so don't place any bar bets on what you see here.

What do you see?



Wednesday, March 4, 2015

Are we all doomed?

If you follow the media covering higher education, you know that for a while, many have been (somewhat gleefully) predicting the demise of the whole industry.  High costs, MOOCs, a weak job market, and shrinking confidence in the value of a college degree are all conspiring, they would say, to create a perfect storm that will be the end of us all.

I'm not saying these people are wrong; you can get in trouble arguing with self-proclaimed prophets, and until a prediction either comes to fruition or it doesn't, all you have is a lot of heated discussion. Personally, I take exception to the smugness of some who seem to revel in their predictions.

But that is, as they say, why they make chocolate and vanilla.

The heat (if not the light) increased this week when Sweet Briar College in Virginia announced it was closing.  The pundits came out of the woodwork, proclaiming that this was just the first domino to fall, all the time apparently reveling in this presumptive proof of their collective acumen in predicting such things.

But a look at publicly available data shows how hard it is to predict such things; many colleges soldier on despite numbers that make them look vulnerable, while a college like Sweet Briar, which occupied a pretty good position on the second chart below, found itself a victim of the most obvious college problem: enrollment too small to support the institution.

You might think that Sweet Briar is the first of many.  You could say the industry is collapsing.  And you might be right.

But it seems there is nothing a prophet likes to point at more than evidence he might be right.  No one is suggesting (yet) any of the other possible reasons things might have gone south, even though this college has received far more attention than the roughly one college per month that has closed since 1969. If it later turns out (and I have no reason to believe it will) that this was a board with no vision, or a horrible case of mismanagement, or one of dozens of other possible causes we can point to, the pundits are unlikely to correct what they're suggesting today.

So take a look at this.  On the first chart, you can see the array of colleges and universities, and with a click of a bubble, find out who's where.  On the second, you can put any college in context with a couple of clicks.  Have fun. Don't get too worked up over what you see.  It's not destiny.

As for me, I'll tell you what Mark Twain once said: “I was gratified to be able to answer promptly, and I did. I said I didn’t know.”  And I'm sticking to it.

Note: It's important to remember that the IPEDS data this is built with occasionally contains errors; don't make any bar bets on what you see here, and if your institution is incorrectly listed, take it up with your IR office.



Friday, February 27, 2015

Ten Years of Endowment Data

While the endowment of a private university is not a big investment pot from which the institution draws income to spend at its discretion (some portion of every endowment is restricted to certain uses), it's a very good proxy for institutional wealth.  What's always been interesting is the enormous size of the top five or six institutions, always led by Harvard, in comparison to everyone else.  And yet Princeton, which enrolls fewer students, has the largest per-FTE endowment.

This visualization shows two things.  The top chart is a tree map, or what I like to call a sheet-cake map.  Think of all the money in all the endowments as one big bowl of batter baked into a cake and then, once baked, sliced up into pieces.  The size of each piece is that institution's endowment as a share of the whole.

The bottom chart shows ten years of endowments, measured at the start of the fiscal year shown, so you can see the hits in 2008-2009 and the overall growth over time.  Of interest: there are only three private universities in the US whose total endowment in 2012 equaled the ten-year growth of Harvard's.

If you click on an institution, the line chart at the bottom will filter to just that college over time.  If you hover over a line on the bottom chart, it will highlight the institution on the top so you can see its place in the endowment universe.

What do you see?



Wednesday, February 18, 2015

Four years of Ivy League Tax Returns

I love the Internet.  Thirty years ago, I couldn't have imagined being able to look up several years of tax returns for the Ivy League colleges and universities (let alone being interested in them).  But Guidestar (a great site you should check out, in case you don't know it) comes to the rescue.  The documents are PDFs, unfortunately, but you learn a lot by entering the data manually into a spreadsheet.

For your information: By law, all universities that receive Title IV funding must make tax returns available to the public, so there is nothing clandestine about this.

The tax returns can show you, albeit at a very high level, how the Ivy League institutions generate revenue, and how they spend it. To no one's surprise, salaries and benefits dominate at almost all colleges and universities, and if you're really curious, the returns list in detail how much the officers and highest-paid non-officers make.

But as I once suggested, the most interesting thing is the massive investment return these institutions generate; even the "poorest" of them--Brown University--averaged about $124 million in investment return over these four years.  Collectively, the investment returns of these eight institutions averaged over $550 million per institution per year, for a grand total of about $18 billion over the four years. To put that in some perspective, there are about 1,553 private, not-for-profit, four-year colleges and universities in America with revenue data in IPEDS; 1,506 of them had total revenues of less than $550 million in 2013.
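
That $18 billion figure is easy to sanity-check; here's the rough arithmetic, using only the averages quoted above:

```python
institutions           = 8
avg_return_per_inst_yr = 550_000_000   # rough average investment return per institution, per year
years                  = 4

total = institutions * avg_return_per_inst_yr * years
print(f"Roughly ${total / 1e9:.1f} billion over the four years")   # about $17.6 billion
```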

Take a spin around this.  It's fairly interesting for the most part, and very interesting for one reason: Princeton's 2013 data (from the 2012 tax return, which I've put here in case you want to take a look).  The return shows an operating deficit of almost $1.3 billion, driven by an investment loss of over $800 million. I asked an expert on university finance (not affiliated with my own institution) about this, and here is what he said:

We were doing some analysis using IPEDS finance info and it showed some really weird results, with Princeton being the strangest of all.  It caused me to pull their audited financial statements and examine them.  Here’s a link to the statements in case you’re curious.  Nothing weird showed up in the statements so I attributed the problem to IPEDS and the Department of Education.  Now having looked at the 990, I believe Princeton has suffered some turnover among its finance staff and the folks doing their reporting don’t know what they’re doing.  As you will see, the financial statements appear to be quite different from what was reported in the tax return.

So, take this, and everything you read from publicly available data, with a grain of salt.




Friday, February 13, 2015

The Race Goes On

Unless you live under a rock, you probably know that colleges are, in general, interested in increasing the number of students who apply for admission.  There are a couple reasons for this, but they're all mostly based on the way things used to be: That is, before colleges started trying to intentionally increase applications.  The good old days, some might say.

In general, increasing applications used to mean a) you could select better students, who would be easier to teach, and who might reflect well on your college, or b) you as an admissions director could sleep a little better, because you were more certain you could fill the class, or c) your admission rate would go down, which is generally considered a sign of prestige.  After all, the best colleges have low admission rates, right?

Well, yes, one does have to admit that the colleges that spring to mind when one says "excellent" all tend to have low admission rates.  Lots of people want to go there, and thus, it must be good.  The trained eye might be able to spot the forgery, but what about the average person?

This week, we have another journalistic treatise presumably exposing colleges for the ways in which they attempt to increase applications.  The tactics listed in this article are nothing new: Reduce the essay, waive the fee, encourage more low-income kids.  Barely mentioned were the Fast Apps, VIP Apps, and Priority Apps many colleges use, which allow them to count as an "applicant" anyone who clicks an email link that says "Start your application."

However, application increases only pay off when you generate them from students who have a reasonable propensity to enroll.  Prestige can be measured by a little-used variable that punishes you when you increase applications to look more selective at the cost of decreasing your yield: It's called the Draw Rate, and it's a powerful indicator of market position.  It's a simple calculation: yield rate divided by admit rate.
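
Since it's just two ratios, here's a minimal sketch of the calculation, with made-up numbers:

```python
def draw_rate(applied: int, admitted: int, enrolled: int) -> float:
    """Draw rate = yield rate / admit rate."""
    yield_rate = enrolled / admitted
    admit_rate = admitted / applied
    return yield_rate / admit_rate

# A hypothetical college: 20,000 applications, 10,000 admits, 3,000 enrolled.
# Yield is 30%, the admit rate is 50%, so the draw rate is 0.6.
print(draw_rate(20_000, 10_000, 3_000))
```

If that same hypothetical college bloats its pool to 30,000 applications but has to admit 15,000 to land the same 3,000 students, the admit rate is unchanged at 50% while yield falls to 20%, and the draw rate drops to 0.4.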

Here's a secret: For some percentage of the freshman class, let's say 33%, recruitment doesn't come into play at all.  A large chunk of your enrollment is natural; that is, those students are likely to enroll no matter what you do.  The next 33% will enroll provided you do everything correctly, make it affordable, and help them understand how they fit. But the last group, that final third, comes from students who have little predisposition to enroll.  Your recruitment tactics focus on them, and you spend most of your time trying to find them, get them to apply, and then get them to enroll.  They may make up as much as 75% of your applicant pool, and they enter it with a lower level of interest.

The problem is that usually, a big increase in applications comes not from the first or second group, and not even the third, but rather a fourth group, the "Ain't no way I'm going to enroll short of a miracle" group.  The bigger problem is that you don't always know exactly who these students are. This is one of the reasons demonstrated interest has become a topic of discussion.

When you artificially increase applications, and you have to cover your ass by admitting more, your yield is going to drop.  And so will your draw rate.

So, let's look at the data.  These charts start out very busy, so you should interact by selecting just a region or Carnegie type.  But even at their busiest, you can see: a) applications are up, b) admits are up, and c) yield rates are down at almost every type of institution, with the exception of the big, private research universities.  The ones you can rattle off without thinking too much.

But look at the Draw Rates, on the last two charts.  Draw rates are down across the board, mostly because capacity is relatively constant, the supply of students is down, and competition is up.  The only winners in the battle to increase prestige? The ones who were prestigious in the first place.  The money spent trying to join that club, or sometimes even just to look more like them, could have been put to better use.

Use the boxes across the top to see the six points of this Tableau Story Points visualization.  Note that the last one exposes some data anomalies which are inherent in IPEDS, often due to typos or new IR staff who count the wrong thing (my alma mater in 2010-2011, for instance).

What do you see? And what do you think?  Is the race for prestige dooming us? Or is it just the latest evolutionary stage in the natural process of competition?




Thursday, February 12, 2015

Degrees Awarded by State

Frankly, the data are a little boring when you first try to visualize them.  When you're looking at the number of degrees awarded by discipline and by state, California, Texas, and New York win pretty much everything.  That's no surprise, of course, as they're the largest states with the most college students.

So I broke it into regions, thinking there must be some differences in the degrees awarded in different parts of the country.  Nope.  The Middle Atlantic wins.  That's where the people are.

Finally, I looked at each state by the percentage of degrees in certain fields, and voila! Something interesting. Different states award different types of degrees in dramatically different proportions. Some of this can be explained easily: A high percentage of business and computer science degrees in Arizona is driven by the University of Phoenix, but other patterns are not so obvious. Why is there such disparity when you look at humanities, engineering, or health professions?

To interact, just select the type of degree in the purple box at the top.  It starts with business, but you can choose anything.  The maps and the bar charts will update to show each state, and the percentage of bachelor's degrees in that state in the discipline selected.

Any explanations?


Thursday, February 5, 2015

When Infographics Fail

There are a lot of bad infographics floating around the Internet.  When they concern things like the difference between cats and dogs, or how many hot dogs and hamburgers Americans eat over the 4th of July, it's no big deal.

But this blog is about higher education data, and when I see bad infographics on that topic, I feel compelled to respond.  This one is so bad it's almost in the "I can't even" category.  It takes very interesting and compelling data--the graduation rates of Black male athletes--and compares them to overall graduation rates at several big football schools in the nation.  Here it is:


For starters, this chart appears to stack bars when they shouldn't be stacked: A graduation rate of 40% for one group and 40% for another group shouldn't add up to 80%.  The effect is that it distorts much of what your brain tries to figure out.  For instance, look at the overall rates (the longer bars) for Georgia Tech and Pittsburgh: Georgia Tech's bar, at 79%, is shorter than Pittsburgh's at 77%, because they start at different points.

But wait, they can't be stacked; Louisville's 44% + 47% is way longer than Notre Dame's 81%. Stacked bars on dual axes?

These also look at first like they could be two sets of bars, with one (the overall graduation rate, which is always higher) behind the Black male graduation rate.  But that can't be, either.  The effect is that you look at Notre Dame and see a very long gap between 81% and 96% (a 15-point spread) that appears to be longer than the 37-point spread at Virginia.

In short, I cannot tell you how this chart was made, or what the assumptions are, let alone what the story really is.

And the image behind the chart makes it even worse; it makes the data hard to see.

Finally, a third element might have been interesting here: The graduation rate of Black males who are not athletes.  It might shed more light on the problem, although if the same designer did it, I wouldn't be confident in the result.

Here's the data presented three ways, each of which tells the story differently, but each better in at least one way. This was literally 15 minutes of work.

What do you think?






Tuesday, February 3, 2015

A Remake of the Pell Institute Data

I would have written a shorter letter, but I did not have time. --Pascal

And sometimes it's that way with data visualization, too.  What starts out as a simple project--one you think will take a few minutes--gets slightly more complicated.  This morning, I came across this interesting Chronicle of Higher Education story, showing Pell Institute Data on Economic Diversity. If you don't want to look at the article, here is a screen grab of the chart.  Click on it to enlarge.


It's not a bad chart, but I found myself taking more time than I should have to figure out the story, which is that for-profit institutions enroll far higher percentages of low-income students than public institutions do.  (Another problem, which I can't fix, is that the data are not complete; for instance, there are no data on private baccalaureate or master's institutions in the set.) Additionally, notice the subtle changes in color when you move from the overall category to any of the institutional types; I think that can be confusing.  And finally, as is the case with all stacked bar charts, it's hard to compare middle values.

The biggest problem, though, is that there are too many things to take in at once. The story can get lost in the chart.

So I attempted a quick visualization, but a small data set (seven columns by five rows) seems to make it harder to get good insight, and I spent a lot more time on this than I thought I would.

This is still not perfect, of course. For instance, it's hard to compare one institutional type to another, unless you have a good memory.  But clicking the drop down box to change the type of institution shown on the bars, and comparing that to the average (orange lines) seems to work pretty well. 

Let me know what you think: Does this make it easier to see the story?




Friday, January 30, 2015

A (better) look at the NACUBO Data

Last night, I gave a little demo of Tableau Software to a graduate class, and tried to make the point that big, long, detailed spreadsheet reports are like a teenage daughter: The information you get sometimes seems like it's given only to meet the minimum requirement of reporting, not to allow you to extract any insight.

This seems to be true of the people at NACUBO, too, who each year release a study of endowment values and one-year performance.  I'd encourage you to click on that link to see exactly what they provide.  Not only is the document lacking any insight about trends or the shape of the market, it's boring.  Most people will look at the top 15 or 20, and then go down the list to find institutions they know and make some comparisons.

Extracting the data is difficult, and even when you do, it's laden with footnote markers and other characters that have to be stripped off or cleaned up. Moreover, there is no ID number attached to the colleges (like an IPEDS ID), so you can't merge other information into it to get real insight (like endowment-to-operating-budget ratios, for instance).  Still, I did the best I could, then visualized it.  It's below.
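
For what it's worth, most of the cleanup is simple string surgery once you've pasted the table into a spreadsheet. Here's a minimal sketch; the column names and footnote patterns are invented for illustration, not NACUBO's actual layout:

```python
import pandas as pd

# Invented examples of what pasted rows tend to look like, with footnote markers attached.
raw = pd.DataFrame({
    "institution":      ["Example University 1", "Sample College 2,3"],
    "fy2014_endowment": ["1,234,567", "890,123 4"],
})

def clean_name(name: str) -> str:
    # Strip trailing footnote digits, commas, and spaces left over from the published table.
    return name.rstrip(" 0123456789,")

def clean_value(value: str) -> float:
    # Keep only the digits of the first token (drops commas and trailing footnote markers).
    return float("".join(ch for ch in value.split()[0] if ch.isdigit()))

raw["institution"] = raw["institution"].map(clean_name)
raw["endowment"] = raw["fy2014_endowment"].map(clean_value)
print(raw[["institution", "endowment"]])
```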

The top chart shows every university on two scales: 2014 Endowment value on the y-axis, and one-year percent change on the x-axis.  They are duplicated on the bottom bar chart.

It gets fun when you use the filters: Eliminate the super-endowments, zoom in on big or small gains by dollar or percent; or use the regions to narrow the geographic view.  (Note: One college had a 213% increase from 2013 to 2014.  You can reveal it if you want by using the second filter.  Pull the right handle all the way to the right.) You can't break it; I promise. But if you get stuck, you can always reset by clicking the small circular arrow at the bottom.



Wednesday, January 28, 2015

Colleges or Investment Firms?

I've worked at a wide range of colleges and universities in my career: From a tiny little college with lots of adults and transfers and commuters, to a classic liberal arts college, to one of the country's best known and wealthiest colleges, to a place just coming out of financial exigency, to one of the largest private universities in the country.  Money, in case you didn't know, makes a difference.

At one of those places, I was docked 18 cents on my first expense report reimbursement, because I had rounded up a tip, making it more than the 15% allowed by college policy.  But it was at one of the most heavily endowed colleges in the nation, at least on a per student basis.  I remember telling this to my mother, who only remarked, "Well, I guess now you know how they got it."

So today's Chronicle of Higher Education article about the "Huge Explosion of Wealth" and the resulting $37.5 billion in contributions to colleges and universities last year caught my eye.  And, as you might suspect, the biggest recipient, at an astounding $1.6 billion, was Harvard.

So, I thought it might be interesting to look at money in higher education to answer my question: Which colleges are raising money on the side, and which investment firms are running colleges on the side?

It's below, using Tableau Software's Story Points.  Click inside the gray boxes at the top to see a different view of the data set.  This starts by breaking these 745 private colleges into two groups: Those who get more revenue from investment return than they get from tuition (orange) and those who get more from tuition than investment return (purple).
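
The grouping itself is trivial once you have the two revenue lines side by side. A sketch, assuming you've pulled IPEDS finance data into pandas with hypothetical column names tuition_revenue and investment_return:

```python
import pandas as pd

# Invented figures for two hypothetical institutions.
finance = pd.DataFrame({
    "institution":       ["Tuition-Driven College", "Endowment-Driven University"],
    "tuition_revenue":   [120_000_000, 90_000_000],
    "investment_return": [15_000_000, 400_000_000],
})

# Orange group: more revenue from investment return than from tuition; purple group: the reverse.
finance["group"] = (finance["investment_return"] > finance["tuition_revenue"]).map(
    {True: "investment > tuition (orange)", False: "tuition > investment (purple)"}
)
print(finance[["institution", "group"]])
```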

This is meant as a sort of high-level flyover; accounting is complicated, and something as simple as growth in endowment can be attributable to a variety of things, such as big gifts or shrewd investment strategies. Not all endowment funds can be spent on financial aid (many of them are restricted), and every university has a different mission and a different student body (for instance, graduate students whom you fully fund can get expensive).

But take a look and let me know if you learn anything.  And if you tweet this out, I'd appreciate you tagging me @JonBoeckenstedt .



Wednesday, January 21, 2015

Which Colleges Graduate the Most Students of Color?

This is a quick and easy little visualization to digest, I hope.

Using IPEDS data, I wanted to look at which colleges graduated the most, and the highest percentage of, students of color.  So here it is.

Use the blue control boxes at right to choose a State, Carnegie Classification(s) and Control, whether public, private, and/or for-profit.  You can use the sliders if you want to look at schools with a certain size range, or only schools that award a certain percentage of degrees to students of color.  The bars are color coded by control.

The left-hand column is fixed to show total bachelor's degrees awarded in 2013.  Use the top right box to control what displays in the center and right-hand columns: for instance, Hispanic students, or the default, All Underrepresented Students of Color (Hispanic, African-American, Pacific Islander, Native American, and Two or More Races).  The center column shows the number; the right-hand column divides the center by the left to get a percentage.  You can also choose White or Asian students, if you wish.

You can sort the institutions by any column just by hovering over the little icon at the bottom near the axis label.

Have fun.  Let me know what you find interesting.


Monday, January 19, 2015

The Hemingway Version of a Faulkner Story

Note: Tableau Guru Jeffrey Shaffer suggested I change from a red/green palette to one that's better for people who cannot distinguish between those two colors.  I changed it to include one view with orange/purple, but kept the original as well.

My undergraduate degree is in English Literature, and so I've read a lot of things I didn't like. In one American literature class I remember, the two heavyweights of the course were William Faulkner and Ernest Hemingway, and the difference in their literary styles made an impression on me.  I'm reminded of this exchange of criticisms:

Faulkner: "Hemingway has never been known to use a word that might send the reader to a dictionary."
Hemingway: "Poor Faulkner.  Does he really think big emotions come from big words?"

And so it goes with storytelling with data.  I downloaded an interesting data table from the Digest of Education Statistics, and worked for a long time trying to find some interesting way to display it.  I had a storyboard with four dashboards, but nothing was telling a compelling story. Part of the problem is that the patterns are hidden across 50 states and three different types of FTE enrollment: public, private, and private for-profit.

And then it happened, and the charts tell the story almost without words.  On the left is the change in FTE (full-time equivalent) enrollment by state from 2000 to 2010, broken out by sector.  Notice: Almost every state in all three views is green, showing positive numbers.  The worst is the khaki color, showing small increases.

In the right column, it's a very different story.  Lots of red, concentrated, interestingly enough, in publics and the for-profit sector.  You can hover over a state for details, but the patterns are pretty clear, even without doing so.

There. A story, with no big words, and just a few pictures.



Thursday, January 8, 2015

Freshman Migration by State

One of the more popular and interesting (IMHO) posts on this blog was the one on freshman migration.  It's interesting to see, I think, where colleges and universities draw their freshmen from.

Normally, I think it's best to start at a high level and drill down, but this time, I'm going in reverse. This is a much higher-level view of the data, by state, and it includes only recent high school graduates who go to a four-year college (you might call them traditional freshmen, even though they're actually in the minority.)  But there is still a lot of interesting stuff here, I think.

You've seen how to manipulate a Tableau visualization (or if you haven't, click here), so use those skills to see how many interesting tidbits you can find in this visualization.  Hover over the top or bottom of a column to sort it, using the little icons there.  Here's a factoid to start: The percentage of freshmen who stay in state is highest in Utah.

What else can you find?



Tuesday, January 6, 2015

Looking at Catholic Colleges and Access

It's the kind of headline that grabs attention: Catholic colleges tell poor students: Go somewhere else.  And it certainly generated some discussion on Twitter and within my own university.  The article was written by Paul Moses, who is a Journalism Professor at Brooklyn College, and who seems to have a strong interest in social justice and Catholic topics, based on his tweets.

When you work at a Catholic college or university, service to the poor is something you talk about all the time.  At my own institution, where about 28% of freshmen receive the Pell Grant, we pride ourselves on the fact that this commitment is in our mission statement.  So I thought the topic deserved something more than the one-dimensional, high-level examination the article offered.

I went to IPEDS and downloaded some data, which is presented here.  To start with, I've included only private institutions, filtering to those with admission rates below 70%, as these institutions have some flexibility in shaping the freshman class, which leads to conscious choices about trade-offs that non-selective institutions don't have to make.  If you want to change that, just slide the control on the top right to suit your taste. There are 1,283 private institutions in this data set, but not all have complete data; the top view currently shows 658 after filtering, colored by religious affiliation. Hover over a dot for details. Click on the pencil in the legend to highlight one group.

Additionally, I've allowed you to filter by broad geographic region, and I think you'll find this instructive as you dive deeper into the data.

The top scattergram arrays the institutions on two scales: The x-axis is set to the net price for students with family incomes of $30,000 or less.  Using the control in the pink box, you can change that.  The y-axis shows the percent of freshmen with Pell Grants, the grant for the neediest students in the country (usually those with family incomes below $50,000).

The bottom chart shows three views, and uses the same filters as the top view, but aggregates them to show the data by religious affiliation.  Note that because the data are limited, it's impossible to weight these statistics by enrollment, as you'd normally want to do, so this is an average of the averages, and far from perfect.  Still, the three values displayed--average net costs, average percent on Pell, and average endowment assets per FTE (full-time equivalent student)--can be instructive.

While I'd never accuse anyone from the Eastern Time Zone of having a particular bias about the rest of the country, it does appear that things vary by region; moreover, the results vary by the income band selected, too. Note the differences between net price for the very low income bands and other groups.  What you see across the board, and not just at Catholic colleges, is that low-income students pay a lot more as a percent of family income than wealthier students do.

While it's easy to select a hypothesis and cherry pick some data to support it, it's a little harder to go deeper and understand that Catholic Colleges look a lot like other private colleges when it comes to this issue.  Yes, Catholic colleges should do more.  No, they are not the only ones who should be singled out.

What's not always obvious are the resources available to do more of the lifting.  In that regard, look at the last column: Endowment assets.  What you see is that Catholic colleges are doing a lot with a little, and as I've said before, the wealthiest, most prestigious universities in the nation are not doing their fair share of the lifting.

What do you think? What do you see?



Wednesday, December 17, 2014

Another 1000 Words and Ten Charts on First-generation, Low-income, and Minority Students

I have always enjoyed writing, and I consider this and my other blog like a hobby.  Usually, I spend no more than 45 minutes on any post, as I don't make my living by writing, and my blogs are not "monetized." But once in a while, an opportunity presents itself to write for a wider audience, and that's when I see what it takes to make a living putting words to paper. That happened this week.

You may have seen my opinion piece in the Chronicle of Higher Education. If not, you can read it first, read it last, or not at all; I think both this and that stand alone, despite their relationship.  We ended up with about 40% of my first draft, which is what happens when you write for a print publication. And of course, a print publication makes interactive charts, well, difficult.

I think there is more to say on the topic, because the recruitment challenges for first-generation, low-income, and minority students look a lot alike, and the more data you look at, the clearer this becomes. So I took what I had pulled together to research and write the CHE piece, formatted it for this blog, and put it here with the understanding that I'd not publish until the Chronicle piece was online.

Frankly, this topic is a little bit personal for me.  My parents, born on farms in Iowa in 1916 and 1917, were among those who never attended high school, let alone college; of their four children, I'm the only one with a college degree. So I was a quintessential first-generation college student, and as I wrote on my other blog, my father made $17,000 in his best year (about $48,000 in today's dollars), which would qualify us as low-income by most standards. In fact, I've always pointed to those factors as powerful influences in my choice of a profession.

I think the story is an interesting one, and I hope you'll stick with it to the end.

Point 1: The challenge of increasing attainment is not a new one, and America has made considerable progress in the past on this front. Consider this mind-blowing statistic: In 1940, about 60% of adults, like my parents, didn't make it past 8th grade. A lot of that generation who lived through the Depression and World War II sent their kids to college in large numbers in the 1960's and 1970's, increasing attainment rates dramatically.

The first chart here is very detailed and interactive, and the second is very simple, but they are based on and show the same data.

With all the charts, hover over a data point for details.






Point 2: Is education inherited?  That is, does it get passed on from parents to children like silverware and pocket watches?  Some data from the Brookings Institution suggest so: If your parents are educated, you're likely to be, and if they're not, the same is probably true of you. Anyone who's done admissions understands there is a cultural component and a context sensitivity that are important when recruiting first-generation students. Students whose parents have never been to college don't navigate the process as well as those whose parents have.

To read this chart, look at each column, representing the educational attainment of fathers, from low on the left, to high on the right.  Each column adds to 100 or so (rounding errors) and the boxes represent where 100 children of fathers with this level of education end up in the educational attainment spectrum. The children of fathers with lower attainment end up near the bottom in greater numbers; the children of well educated fathers end up near the top (blue boxes are higher numbers). Orange boxes (low numbers) show up when we see how many children move in either direction; gray boxes are numbers in the middle.

Feel free to argue nature, nurture, or systemic issues here.  We like to think education is based on meritocracy; in fact, it may be an "inheritocracy."



Point 3: If education is inherited, this presents a problem that affects students of color, because African-American and Hispanic students are less likely to live in homes where a parent has a bachelor's degree.  Thus, they cannot "inherit" educational attainment. So ethnicity = attainment, to a large degree.




Point 4: And not surprisingly, college attendance rates already vary dramatically by both income:




And by ethnicity:


So income = ethnicity and ethnicity = attainment and attainment leads to attainment. (And I use the term "equals" pretty liberally here, of course.  It's obvious that nothing is destiny.)

Point 5: There is also the issue of affordability: Students of color tend to come from families with lower income levels, and I doubt we need a chart to demonstrate the relationship between parental education and family income.  But believing you can't afford a college education, accomplishment notwithstanding, may be the biggest barrier to educational attainment we have.



On one of the criteria weighted most heavily in admissions, standardized test scores, students of color and low-income students tend to score lower.  If colleges and universities are serious about enrolling more first-generation students, low-income students, or students of color, they need to take a serious look at the weight of tests in the admissions process, and at the extent to which they let the public judge them on freshman test scores.  While I'm generally agnostic about the US News and World Report rankings and the pursuit of them, this is the one area where I believe criticism is justified.

The 50 boxes for each ethnicity (five test score bands by ten income levels) are colored to show the percent of total by ethnicity, and those values add up to 100. Blue values are higher, and orange values are lower.  So you notice that the bluest box for African-American and Hispanic students is in the lower left corner (low income, low scores); for Asian and Caucasian, it's near the upper right (high income, high scores).  Hover over any box for details; this is from ACT EIS Data for 2010 and is created by compiling spreadsheets extracted from the software.
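
Mechanically, each ethnicity's grid is just a two-way count of students by income band and score band, converted to a percent of that ethnicity's total (which is why each grid sums to 100). A sketch with made-up rows, since I can't share the EIS extract itself:

```python
import pandas as pd

# Made-up student-level rows; the real source is compiled ACT EIS spreadsheets.
students = pd.DataFrame({
    "ethnicity":   ["African-American", "African-American", "Caucasian", "Caucasian"],
    "income_band": ["< $24k", "$24k-$36k", "$100k-$120k", "> $120k"],
    "score_band":  ["1-15", "16-19", "24-27", "28-36"],
})

# Count students in each income x score cell, then express each cell as a percent
# of that ethnicity's total so every five-by-ten grid adds up to 100.
counts = students.groupby(["ethnicity", "income_band", "score_band"]).size()
percent = 100 * counts / counts.groupby(level="ethnicity").transform("sum")
print(percent.round(1))
```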


Of course, no one believes that being Hispanic or poor is the direct cause of lower attainment on standardized tests. Many believe the lower scores are proxy values for opportunity, and there are many who believe that standardized tests simply reflect the accumulation of social capital throughout a child's lifetime.  And much of the social capital necessary to be admitted to college comes with wealth and parental attainment.

Point 6: And finally, I offer these two charts.  First, the relationship between freshman test scores, ethnicity and Pell at American Colleges and Universities. You can look at private and/or public by using the filter, but regardless, you can see the relationships played out: In general, the higher your freshman class SAT, the lower the percentage of freshmen on Pell, and the less diverse you are. Thus, when we ask universities to be "excellent" and we define "excellence" by input variables like SAT or ACT scores and selectivity, this is what we're left with: Colleges who want to do the right thing have to act counter to their own interests.


And this: The relationship between institutional wealth and economic profile of the freshman class. You can click here to read the blog post when I originally published it, if you'd like.




So, I hope the points and the patterns in them are obvious: When we talk about increasing college attendance among first-generation, low-income, and minority students we're really talking about the same thing.  And all those factors pull, unintentionally, against the very things many colleges and universities strive to achieve.  It will take some very brave institutions at the top of the food chain to break out of the pack and move us collectively in the right direction.

I think it's a vicious circle of our own doing. What do you think?





Monday, December 15, 2014

Looking at Student Loan Default Rates

Student loan defaults make a lot of news, but there is not a lot of understanding about what a default actually is, and there isn't good, easily accessible data on default rates, nor much good contextual analysis.  But this may help a little.

First, the source of the data is here.  You should read it, especially the part about small numbers of students entering repayment, or small percentages of students taking loans at a college, skewing default rates.  You should also know that the definition of a default is being at least 270 days behind on a payment.

This is not the easiest data to work with.  For one thing, the file layout descriptions don't match the file; Financial Aid uses a different ID than IPEDS, and the crosswalk tables that might help you figure out the IPEDS ID (to get a richer view of context) use a different format than this table does. In addition, the "Region" doesn't roll up the states in any way I've seen before, and the "Program Type" puts colleges in categories that don't always make sense.  For most four-year institutions, try "Traditional" first in the selector box.
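
If you do track down a crosswalk, the join itself is the easy part. A sketch with hypothetical column names (OPEID for the financial aid ID, UNITID for IPEDS; your files' actual headers may differ):

```python
import pandas as pd

defaults = pd.DataFrame({
    "OPEID":        ["00123400", "00567800"],
    "default_rate": [4.2, 11.7],
})
crosswalk = pd.DataFrame({
    "OPEID":  ["00123400", "00567800"],
    "UNITID": [111111, 222222],
})
ipeds = pd.DataFrame({
    "UNITID":   [111111, 222222],
    "pct_pell": [18, 47],
})

# Attach the IPEDS UNITID via the crosswalk, then pull in some context (here, % Pell).
merged = (defaults
          .merge(crosswalk, on="OPEID", how="left")
          .merge(ipeds, on="UNITID", how="left"))
print(merged)
```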

But here it is.

If you want to eliminate the small schools that skew things, you can use the "Borrowers Entering Repayment 2009--11" filter.  You can just type the ranges in the boxes and hit enter, or use the sliders.  You can also limit to states or region, in any combination.

A reminder that outputs are sometimes actually inputs.  If you enroll high ability, wealthy students, and are very selective in admissions, your default rates are going to be lower than other institutions that take more chances on students who come from low-income or less-prepared backgrounds.  It would be great if there were a way to recognize the institutions with lower default rates who took more risks.

What jumps out at you?



Thursday, November 20, 2014

What is the Pell Grant Worth?

The Pell Grant got its start as the Basic Educational Opportunity Grant, or BEOG, sometime in the 1970s; it was later named for Rhode Island Senator Claiborne Pell.  The idea was simple: To provide a basic level of financial support for students from low-income families who aspired to go to college.

It's almost certainly had a lot to do with increased levels of educational attainment in America, but rapid tuition increases, coupled with smaller increases in the Pell, mean the gap between Pell and tuition has grown over time.

The College Board has compiled a lot of good data on this and other financial aid trends in its report, updated annually, on its Higher Education Trends site, where you can download the data. Unfortunately, the data looks like this when you get it.  Maybe you can extract the insight; I can't.



So I pulled it into Tableau and spent an hour or so with it to see what I could find.  It's below.  The College Board has calculated enrollment-weighted average tuition by type (4-year public and 4-year private, not-for-profit) which makes comparisons easier, and has adjusted everything (including the maximum Pell Grant) for inflation.  You can see on the three views what's happened to tuition, fees, and Pell; how much they've changed on a percentage basis; and the purchasing power of Pell over time.
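
The "purchasing power" view boils down to one ratio per year: the maximum Pell Grant divided by average tuition and fees, with both already in constant dollars. A sketch with illustrative numbers (not the College Board's actual series):

```python
# Illustrative, inflation-adjusted figures; substitute the College Board's published series.
years          = [1995, 2005, 2014]
max_pell       = [3700, 4800, 5700]   # maximum Pell Grant, constant dollars
public_tuition = [4500, 7200, 9100]   # average published tuition & fees, public four-year

for yr, pell, tuition in zip(years, max_pell, public_tuition):
    print(f"{yr}: the maximum Pell covers {pell / tuition:.0%} of average tuition and fees")
```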

Occasionally (OK, frequently) I've criticized highly selective institutions for enrolling very low percentages of Pell Grant students in their freshman classes.  If you wanted to argue that those students don't make much business sense, you might have a point.  But you'd also be right to point out that the diminished purchasing power of Pell is due in large part to rapid increases in tuition. So there's plenty of blame to go around.

What else do you see?