Tuesday, November 17, 2015

How will demographics change enrollment?

Ever since I started in admissions, people have been talking about demographic changes and challenges, and the chant continues.  The future, we're told, will look very different from the present.

Our trade paper, the Chronicle of Higher Education, ran an article about how this might affect higher education.  It included lots of interesting charts and graphs, but didn't allow me to look at the data in the ways I wanted to.  So I downloaded it and started looking at it using Tableau.

This is as much a testament to self-service BI as it is to the trends in the data.  I've often spoken about the 80/80 rule of business intelligence: 80% of what an analyst gives you, you don't need; 80% of what you want isn't in the report.  I spent a long time playing with and slicing this data to see if I could find a way to present it that makes sense, and that gives people what they want.  And every time I answered a question, I generated several more ("what if" can waste a lot of time).

In the end, after several different views, I settled on the first one, below.  It's very simple, yet it gives you the flexibility to find out most of what you need.

On the chance that you want or need something else, though, I kept the other views I had been experimenting with.

View 2: Maps and Details allows you to see the data mapped; once you filter to a region, you can see how states compare.

View 3: Changes within a State over Time looks at the same data four ways: Numbers, percent change, percent of total, and numeric change by ethnicity.

View 4: Counties Mapped allows you to select a state and see where concentrations of ethnicities live; choose a state, choose the ethnic group and age of the population, and see the results.

View 5: States and Counties shows ethnic percentages for every county, listed by state.

View 6: Counties shows all counties regardless of state.  Did you know there are 40 counties in the US where every 18-year-old is white? Or that one county in South Dakota is 98% Native American?

Some notes about the data are on the CHE website.  Be sure to read them so you know what this shows and doesn't show.

Again, remember to interact.  You can't break anything.

And if the frame is not displaying the visualization correctly, you can go right to the original on the Tableau Public website.

Tuesday, September 29, 2015

The Pell Partnership Data

Yesterday's big news, of course, was the announcement of "The Coalition," the curiously-named group of about 80 colleges and universities making for very strange bedfellows.  I wrote a little bit about it here.

Today I came across a little data set that contained information about Pell graduation rates and non-Pell graduation rates, and I thought it an interesting opportunity to look at it in light of yesterday's news.  So, over my lunch hour, I did (yes, this software is really that easy to use. You should try it out.)

It's presented here in Tableau Story Points.  Just use the gray boxes across the top to look at the different views of the data.  Most of it should be self-explanatory, but if not, leave a comment and I'll reply to it.

FYI, several schools from "The Coalition" did not supply Pell Grant grad rate data. In alphabetical order, they are:

  • Columbia (NY)
  • Hamilton (NY)
  • Harvard (MA)
  • Rutgers (NJ)
And you can ask them why they didn't.  I would never speculate about such things. (Olin College of Engineering did not provide data, saying its sample size was too small.)

Monday, September 28, 2015

The Peacekeeper Missile Comes to Admissions

Maybe you're too young to remember the Ronald Reagan presidency, but one of the things I remember most is the "Peacekeeper Missile." People were incensed by what they believed to be political doublespeak worthy of the book 1984.  Missiles were objects of destruction, not something you associated with peace.  Change the language, change the discussion.

So today, this happened.  In what Inside Higher Ed is calling "An Admissions Revolution," eighty of the country's top colleges have formed a "Coalition" (a nice, political-sounding word: I mean, they form coalitions in Canada, so it must be nice, right?) to create a new application, as well as a new portfolio system that lets students, starting as early as the 9th grade, assemble documents and other resources, not unlike my suggestion about Google managing the application process.  The goal, ostensibly, is to get more low-income and first-generation students interested in and ready for college, and applying to these mostly-selective institutions.

This sounds great, right? Right?  You'd think so.

Of course, if you know anything about college admissions, your first question might be this: Today, one day after the announcement, which group is probably more aware of The Coalition?  A) first generation, low-income students of color from under-resourced high schools, or B) white students of wealthier, college-educated families who have already begun planning for college at--or well before--the 9th grade.  I'll give you a moment.

In an industry already obsessed with prestige, this sounds like a club that won't take just anyone as a member, unlike the Common App, which has recently--God help us all--begun to allow colleges to determine for themselves what admissions criteria are important.

The collective gasp from the super selective members of Common App sounded like a Rockefeller in the presence of someone who extended the wrong pinkie finger when drinking tea.  "We just can't have these, these, Commoners, in the Common App," they decided without discerning a hint of irony, and they started their own country club, which of course, will do the requisite charity work one expects of any decent country club.

The standards for membership are fairly arbitrary: A 70% graduation rate for all members; for privates, a pledge to meet "demonstrated need" (a patently ridiculous term both in definition and in the way it's practiced); and for publics, "affordable tuition and need-based aid for in-state students."

Does that seem backwards to you?  Shouldn't public institutions, which I believe were generally founded by the public for the public, be held to a higher standard of serving, you know, the public they're supposed to serve?  And of course, remember my frequent rant that high graduation rates are an input, not an output.  Even as blunt an instrument as US News and World Report recognizes that if you enroll more Pell grant recipients, your graduation rate will drop.

Which brings me to the last point.  These institutions are, for the most part, selected from the institutions that a) have the most resources, b) charge the most, and c) enroll the fewest Pell grant kids.  Is this new application, which fragments the process even further, and clearly--not even possibly, but clearly--favors wealthier kids really the answer?

Or is the name--The Coalition for Access, Affordability and Success--just a political ploy from institutions that don't really seem to know much about access in the first place?  A new take on the Peacekeeper Missile? An homage to 1984?

Look at this, showing about 1700 four-year private and public institutions, each as a bubble.  The Coalition institutions are in red, everyone else in gray.  Colleges to the right have higher median SAT scores in the freshman class (another proxy for wealth, of course); colleges lower on the chart have fewer Pell grant kids as a percentage of all freshmen.  Larger dots are wealthier.  Hover over any dot for details about that college.

The two-bar chart on the top shows Pell Grant enrollment.

There is one filter, to allow you to look at all institutions, just public, or just private.  Go ahead, click. See if it makes much difference.  And remember:

“War is peace. 
Freedom is slavery. 
Ignorance is strength.”

“It's a beautiful thing, the destruction of words.” 

Tuesday, August 18, 2015

How Pell Grant Recipients Fare at America's 80 Largest Universities

On my train ride in this morning, I saw an article posted on Twitter about Pell Graduation rates at the 80 largest universities in America.  If you want to look at a boring table of static data, just click here.

But I wanted to see if there were any patterns, so I copied the table, pasted it into Excel and then opened in Tableau to visualize it.  I think it tells an interesting story, although the data set is unfortunately limited, and with no key to merge the data with another set, it loses some potential.

Start by looking at the first view.  For each institution, there are three columns: The overall six-year graduation rate; the six-year graduation rate of Pell recipients; and the spread, with the values sorted on spread from low to high.  In this instance, a negative number means Pell students graduate at a higher rate than the student body overall, and a positive number means just the opposite.  As you scroll down the list from top to bottom, ask yourself what makes the pattern make sense.  There are dozens of possible explanations; all I could see was "football," but you might see "big public research university."  Or something else altogether.
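The spread column is simple subtraction.  Here's a minimal sketch in Python, with made-up rates rather than the actual dataset:

```python
# Spread = overall six-year grad rate minus Pell grad rate.
# A negative spread means Pell students graduate at a HIGHER rate.
# Illustrative numbers only -- not the actual dataset.
schools = [
    ("School A", 82.0, 85.0),   # (name, overall rate, Pell rate)
    ("School B", 65.0, 58.0),
    ("School C", 71.0, 71.0),
]

rows = [(name, overall, pell, overall - pell) for name, overall, pell in schools]
rows.sort(key=lambda r: r[3])   # sort by spread, low to high

for name, overall, pell, spread in rows:
    print(f"{name}: overall {overall:.1f}%, Pell {pell:.1f}%, spread {spread:+.1f}")
```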

If you want to sort by another column, hover over the axis until the little icon pops up and click away. The "reset" at lower left does just what it says it does.

The second view (on the tabs across the top) shows the Pell graduation rate scattered against the percentage of freshmen with Pell.  The bubbles are colored and sized by spread (blue and large are good for Pell students; red and small, not so much.)  Right away you see the pattern: If you enroll fewer Pell students, your Pell graduation rate is higher.  My hypothesis would be that more selective institutions (who have higher graduation rates overall) a) simply select the most capable from among the poor students they admit, and b) have more resources to fund the smaller percentage of low-income students.

What do you see?

Wednesday, August 12, 2015

Watch Out, Guys

Women have made tremendous strides in educational attainment of bachelor's degrees in the last half of the 20th century and the first decade of the 21st.  And even though doctoral degrees have lagged behind, we can see dramatic changes there as well.

Take a look at this visualization using National Science Foundation Data (this link downloads the data for you in Excel as Table 14).  What you see over time is a dramatic increase in the number of women who earned doctorates since 1983, but also a shift in the percentage distributions. Women are now the majority in Life Sciences, Education, and Social Sciences, and close to dead even with men in all fields except Physical Sciences and Engineering.

The second view (using the tabs across the top) shows doctorates by broad discipline over time.  Use the filter at the top to compare men and women, or to see the totals.  Note the tremendous percentage growth in women in engineering since 1983: From 124 to 2,051, an increase of over 1,500%.
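That growth figure is the standard percent-change calculation, using the counts from the NSF table:

```python
# Percent growth in women earning engineering doctorates,
# from 124 in 1983 to 2,051 in the most recent year in the table.
growth = (2051 - 124) / 124 * 100
print(f"{growth:.0f}%")  # roughly 1,554%, i.e. "over 1,500%"
```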

While it's not necessarily true that most doctoral recipients work in higher education, it's true that higher education gets most of its instructional faculty from doctoral recipients; the long, slow trend (assuming it will continue, or even just stabilize) means there are some interesting changes in store in the higher education labor force in the coming decades.  It's possible college faculty will look very different 20 years from now.

What do you think?

P.S. You might also be interested in this, showing bachelor's attainment over time.

Tuesday, July 21, 2015

What Happens to 100 9th Grade Students in Your State?

While waiting for 2014 IPEDS data to come out, I've been searching the web for more good educational data to visualize, and came across this site, where I found a nice little data set.  It's from 2010, and tracks 9th graders through high school and college.

We typically think of looking at high school graduates and measuring how well they do, which is important, of course.  But you can have a high percentage of graduates enrolling in or graduating from college masking a problem of high school dropouts.  This data helps look at that.

For all the data here, assume you start with 100 students in 9th grade in the state:

  • What percentage of them graduate from high school?
  • What percentage of them enter college?
  • What percentage make it to the sophomore year of college?
  • What percentage graduate from college within 150% of normal time (in other words, within six years)?
Finally, there is another, more traditional measure included: The percentage of high school graduates who graduate from college.
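One way to think about these four measures is as a chain of conditional rates applied to the starting cohort.  A small Python sketch, with purely hypothetical rates (not any particular state's numbers):

```python
# Tracking a hypothetical cohort of 100 ninth graders through each stage.
# These survival rates are illustrative, not real data for any state.
cohort = 100
stage_rates = {                       # conditional rate of clearing each stage
    "graduate high school": 0.80,
    "enter college":        0.60,
    "reach sophomore year": 0.75,
    "graduate in 150% time": 0.55,
}

remaining = cohort
for stage, rate in stage_rates.items():
    remaining *= rate
    print(f"{stage}: {remaining:.1f} of the original {cohort}")

# The more traditional measure -- college grads as a share of HS grads:
hs_grads = cohort * stage_rates["graduate high school"]
print(f"college grads per 100 HS grads: {remaining / hs_grads * 100:.1f}")
```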

The data are interesting by themselves, but I also rolled in census data of median family income by state in 2001, presumably the year the 9th grader tracking began.  It's by no means perfect: New York City and Elmira in New York, for instance; Dallas and Colorado City in Texas; or Hollywood and Fresno in California share very little except a state capital. I've made no adjustments for purchasing power of a dollar, either.  The high incomes in Alaska mask a much higher cost of living, and the remoteness of the state and relative dearth of post-secondary options make its attainment rating skew low, in all probability.

  • On the first view, the map, hover over any state to get a popup chart.  Go to the top left corner of the 48 States map to zoom; resets are at the lower left of the visualization.  The states are colored by the percentage of high school graduates who earn a college degree.
  • On the second view, the scattergram, the x-axis is always the rank of median family income. Choose any other value to plot on the y-axis.  The states are colored by region, and you should note that the axes are reversed, so a rank of 1 is high and to the right.
  • And the third view is a slope graph, where you can compare any two measures of educational attainment in the states using the right and left controls.  The line connects the two ranks.
What do you see here? I'd love to hear your thoughts.

Thursday, July 16, 2015

Where did you go to college?

Many people in higher education are fascinated with prestige, whether we like to admit it or not.  The question, "Where did you go to college?" can carry a lot of weight in job interviews or even casual conversation as people get acquainted.

The National Science Foundation annually publishes data telling us the colleges that produce the most alumni who go on to earn a doctorate from a US institution in a given year.  It's not a great data set in itself, and some brave soul will take IPEDS degree data and merge it to show which of these institutions are the most efficient producer by discipline, but that's not what you'll find here.

On this visualization, any time you see a college listed (UCLA, for instance), it shows how many bachelor's graduates of that institution earned a doctorate in 2012.  It's not the university that awarded the doctorate; that could be anywhere in the US (The University of Texas, or Stanford, for instance.)

There is some interesting stuff here, even if you just stick to the first visualization, where you can choose a broad or specific field, and see which institution produces the most alumni who earn a doctorate. I've sorted them by Carnegie Type, so that Carleton, for instance, doesn't have its accomplishments diminished by the big research institutions.

One thing that jumped out is the surprisingly high percentage of doctorates in engineering earned by graduates of foreign colleges and universities.

What else do you see?

Wednesday, July 1, 2015

Tuition Transparency Ratings

The Federal Government released its Tuition Transparency Ratings today, to help students and parents find out how fast colleges are raising tuition and net price.  And as is the case with many well-meaning government programs, the data doesn't always tell you the whole story.

The top chart on this visualization shows tuition and fees at about 6,000 colleges and universities; the light blue bar is 2011, and the orange square is 2013.  To the right is the two-year percentage increase.  If you want to limit your selections or sort the colleges differently, take a look at this image, which I've embellished with some instructions.  Click to view larger.

The second chart, at the bottom, shows net price for 2010 and 2012.  Net price is calculated after grant aid, which is only reported at the end of the year, which explains the delay.  It's pretty much the same: 2010 on the aqua bar, 2012 on the red dot, and percent change in the purple circle.  The filters and sorts work the same way on this one.

There are a couple of problems here: One is the data.  I could not find a single program on the New England Culinary Institute website that listed a tuition of $88,000, but that's the data shown here. There are several instances like that in this data; even if they are technically accurate because of the way a program is configured, it doesn't advance our understanding of the issue much.

But more important, net cost is a function of who enrolls and how much aid you can give: If you suddenly stopped enrolling middle-income students, or you have small enrollments, the results can be very volatile. Net cost is a remnant, not a target that can be tightly controlled.  And, it seems in many instances net cost is being calculated by different people in different ways over the two-year period.

Still, there is some good stuff here, I think.  Take a look and let me know.

Tuesday, June 23, 2015

Looking at Medical School Admissions

Most of the things I look at have to do with publicly available data sets, and that often means undergraduate admissions.  But while doing some investigation, I came across data from the Association of American Medical Colleges.  There's some interesting stuff there, and while it's formatted in a way that makes it really difficult to get to, it's worth a little work.  (I'm not convinced that the formatting isn't an attempt to keep less stubborn people from digging too deep on this; my request to get the data in a better format was ignored.)

Best thing I learned: In 2014, of the 49,480 applicants to medical school, 41.1%, or 20,343, enrolled. That's a far higher percentage than I would have thought, although it is lower than the 2003 rate of 47.5% (34,791 applicants and 16,541 matriculants).  It's clear, of course, that most medical school applicants are very well qualified, so that number represents the best of the best, but the perception of medical school selectivity is driven by the rates at each individual institution (sometimes 5% or less); in fact, each student applies, on average, to about 15 medical colleges, which skews the numbers.  These numbers are just for M.D. admissions, not D.O. or other medical professions.
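The gap between the overall 41% figure and the tiny per-school admit rates follows from application volume.  A rough sketch, using the post's 2014 totals and the approximate 15-applications-per-student figure (the per-application rate is illustrative arithmetic, not an AAMC statistic):

```python
# Why per-school admit rates understate an applicant's overall chances.
applicants = 49480          # 2014 applicants, per the post
matriculants = 20343        # 2014 matriculants
apps_per_applicant = 15     # roughly, per the AAMC data

overall_rate = matriculants / applicants
total_applications = applicants * apps_per_applicant
# Spread across all those applications, the average "success rate"
# per application is far lower than the per-person rate:
per_application_rate = matriculants / total_applications

print(f"overall enrollment rate: {overall_rate:.1%}")         # ~41.1%
print(f"per-application rate:    {per_application_rate:.1%}")  # ~2.7%
```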

This visualization has seven views, and starts with an intro.  You can get to the other six by clicking the tabs across the top:

  • A scatter, showing each medical college, colored by region, on two scales: Total applications and the number of applications per seat
  • Historical data for MCAT and GPA performance for applicants and matriculants over time
  • Applications, by ethnicity.  These are in a heat map format; the orange squares represent the highest values on that individual grid
  • Admit rates, by ethnicity.  This represents (I'm 99% sure) the chance that a student in the category shown, represented by the intersection of column and row, was admitted to at least one of the schools she applied to
  • Applications per seat in the entering class, broken out by male, female, and in-state status
  • Matriculant diversity, shown as male/female and in-state/out-of-state
By the way, if you need some understanding of MCAT scores, you can see them by clicking here.

If you're like me, you have a lot of questions that are not answered by the data AAMC provides.  But it's still a good start.  What do you notice here?

Monday, June 8, 2015

Diversity of Institutions, by Type

A few posts ago, I wrote about where students of certain ethnicities went to college.  In other words, if you looked at all the Hispanic students in the US, we'd want to see where they go to college, and compare that to Asian students, or students of two or more races.  I asked whether a student's ethnicity determined where they go to college.

This is the same data, but it examines it from the other end: The colleges, and how diverse they are.  In other words, does your location, control, size, and Carnegie type, for instance, determine how diverse you are, or limit how diverse you can become?

Again, the answer is no, but you can find some interesting trends.

If you're timid about using Tableau and interacting with it, here's your chance.

  • First, choose an Ethnicity in the top left corner.  For instance, assume you want to display the percentage of enrollment that is Asian.
  • Then, choose what value you want to display along the y-axis (the left side, from top to bottom)
  • Choose how to display the x-axis using two controls.  If you want just one dimension along the x-axis, make it the same variable for both x-axis controls.
Using the default values, look at the top right box.  This means that at the two private-for-profit Doctoral/Research Institutions in the Western States, the undergraduate enrollment is 81.6% non-white.  Hover over the box for details.

Now, click on that box, and the bar charts at the bottom update to show you those three schools, and the percentage of the student body of the ethnicity indicated.

As always, if you get stuck, just use the undo or reset buttons at the lower left:

There is a LOT to play with here.  What do you notice?

Monday, June 1, 2015

Enrollment at Women's Colleges, 2005 to 2013

Note: I got an email from Dean Kilgore at Mount Saint Mary's in California, who indicated I'd downloaded data for the wrong Mount Saint Mary College: in this case, the one in New York. I had to create the list manually, and it was just a slip on my part.

Sorry about that. I've removed them from the analysis, but unfortunately, can't add the correct one at this time without a considerable amount of work.

Sweet Briar College in Virginia recently announced, to the shock of many in higher education, that it would be closing at the end of this spring, 2015 term.  As often happens when a college decides to close, those who are or were close to it rally the troops and wage a fierce campaign to try to keep it open.  Sometimes it works, other times, it doesn't.

The scene playing out is not unusual: Allegations of secret deals, incompetence, blindness to all that is and was good at Sweet Briar. This is what happens when you decide to close a college.  And although I'm not taking sides, I did write before that the closing does seem to be curious in light of what little publicly available financial data there is: If you had to pick a college from this list that was going to close, it probably wouldn't be Sweet Briar.  Even the federal rankings of financial responsibility gave Sweet Briar a 3, a score higher than Harvard, which may only point out how absurd those ratings are in the first place.

A while ago, I downloaded a pretty extensive data set, using the members of the Women's College Coalition as my base.  Not all colleges have data available in IPEDS, however, so I did the best I could (for instance, the Women's College at Rutgers is not in IPEDS as a separate institution, or if it is, I couldn't find it.  And I took out Saint Mary of the Woods, as they just announced they're going co-ed).  Also, since there is no IPEDS data field that tells you when a college is a women's college, I couldn't go back and find out how many were labeled as such 20 years ago.  That might have been interesting.

Overall, though, the data were pretty uninteresting.  So I gave up on visualizing it.  There were trends, of course, but nothing dramatic.

So, when I saw this article, by one of the people leading the charge on the Save Sweet Briar campaign, one sentence jumped out at me:

Enrollment: There is no evidence that enrollment is declining, either at Sweet Briar or at women’s or liberal arts colleges. This claim is simply false. Numbers people, please check for yourself: The data are publicly available.

The data are available, and the link goes to the IPEDS site I use all the time.  So, take a look here. There are five views of the data, using the tabs across the top.  The first shows changes in freshman, total, undergraduate, and graduate enrollment over time.  The changes on the right are shown in relation to the prior year.  The second shows the same data, but the change is cumulative since 2005: As you can see, total undergraduate enrollment is down almost 6% during a time enrollment increased nationally.  The third shows admissions activity; the fourth breaks it out, showing Sweet Briar and all the other women's colleges in aggregate.  And the fifth shows total undergraduate enrollment in 2005 and 2013 (on the left) and change (on the right.)  As you can see, there are some big winners, big losers and a lot of small changes.
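If the difference between the first two views isn't obvious: one compares each year to the prior year, the other compares each year back to the 2005 base.  A small sketch with made-up enrollment numbers:

```python
# Year-over-year change vs. cumulative change since a base year.
# Made-up enrollment series, not Sweet Briar's actual numbers.
years = [2005, 2006, 2007, 2008]
enrollment = [700, 680, 690, 660]

for i in range(1, len(years)):
    yoy = (enrollment[i] - enrollment[i - 1]) / enrollment[i - 1] * 100
    cum = (enrollment[i] - enrollment[0]) / enrollment[0] * 100
    print(f"{years[i]}: {yoy:+.1f}% vs prior year, {cum:+.1f}% since {years[0]}")
```

Note how 2007 is up against the prior year but still down cumulatively; that's why the two views can look so different.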

Decide for yourself.  And tell me what you see:

Wednesday, May 27, 2015

Does Ethnicity Determine Where You Go to College?

The answer to the headline, of course, is "no."  Race is not a determinant of where you go to college, but race--or more probably the factors that vary with race and ethnicity--may influence your college choice set, which can, of course, influence where you go to college.

I've written before about how all these variables are at play with each other: In America, race, income, parental attainment, and presumably, opportunity, all cluster together.

And, after you look at this, you'll see how opportunity gets distributed by race, provided you're willing to click a button or two.  The visualization starts off showing all undergraduate enrollment in almost 7,000 post-secondary institutions that report to IPEDS.  (And before you object, I believe strongly that the college you attend is not your destiny, and that education is what you make of it, as I've written before on my other blog. But it's also clear that many people believe talent congregates at the "best" colleges, and the way this plays out in hiring and graduate school admissions can be troublesome.)

As you can see, almost 75% of all undergraduates go to a public institution; the majority of them go to Associate's granting institutions (almost all community colleges.)

But use the control at the top right to see how the distribution changes: Try Black or Hispanic or Asian or White to watch the bars move.  What you see is a change: Hispanic and African-American students go to community colleges and for-profits at much higher rates than their White and Asian peers.  White students go to private, not-for-profits at higher rates than almost any group, except Non-residents.

International (Non-resident, here) students flock to research universities, but are also far more likely than any group to attend a private university.  This is because they avoid for-profits in great numbers.

What else do you notice?

If you want to limit the population, feel free to use any of the filters, but beware that the percent of totals are taken on the base of the sub-population you've chosen, not the entire population.  You'll notice there are further differences by geography and campus urbanization among other things. I'd love to hear what you turn up that's intriguing.

Friday, May 22, 2015

Endowments Over Time

Is it true that the rich get richer, as suggested by a recent Washington Post article discussing college endowments?  Probably.

But what's more interesting is the pecking order in the world of endowments, and how it's changed (or not) over the last decade or so.  I downloaded trend data from IPEDS showing beginning of year endowment by fiscal year for about 160 four-year private institutions with enrollments of at least 5,000.  To be sure, there are some very well endowed colleges who have smaller enrollments (Williams, Pomona, Grinnell, etc.) but this chart could get messy pretty fast.

I pulled the data into Tableau and created a bump chart, showing not the value over time (because everyone does that) but rather the rank of all the endowments shown.  So, for instance, if a line slopes down, it might mean a college has gone from 10th to 20th; a slope up might show the opposite.
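If you wanted to build the same kind of chart yourself, the key step is converting each year's values into ranks before plotting.  A toy sketch, with invented endowment figures:

```python
# A bump chart plots rank, not value: convert each year's endowment
# values (in millions, say) to ranks, then connect each college's
# ranks across years.  Tiny made-up example, not the IPEDS data.
endowments = {
    2005: {"College A": 900, "College B": 750, "College C": 600},
    2010: {"College A": 950, "College B": 700, "College C": 800},
}

ranks = {}
for year, values in endowments.items():
    ordered = sorted(values, key=values.get, reverse=True)  # biggest first
    ranks[year] = {college: i + 1 for i, college in enumerate(ordered)}

print(ranks[2005])  # {'College A': 1, 'College B': 2, 'College C': 3}
print(ranks[2010])  # {'College A': 1, 'College C': 2, 'College B': 3}
```

Here College B and College C swap ranks between 2005 and 2010, which is exactly the kind of crossing line a bump chart makes visible.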

But what we see here are lines that are essentially flat.  There are a couple of anomalies, of course, and as always with IPEDS, a warning about human errors in data entry, incompetence, or other agendas at work is in order.  When data was missing for a year, I simply put the prior year's data in its place for continuity.

I started with 30, but you can show more or fewer using the slider at the top.  Just pull it to however many colleges you'd like to display.

See anything interesting?

Wednesday, May 20, 2015

A Look at Federal Loans for Students

We hear all the time about the student loan bubble.  Is it a problem?  Like most things, it depends on how you look at it.  Here is data viewed from a very high level, without the benefit of being able to drill down.

It shows the status of the government's student loan portfolio, broken out by the older FFEL programs and the currently existing Direct Loan Program; the FFEL loans are older, while the Direct Loans are a mixture of old and new.

Some interesting trends appear, even over relatively short periods of time.  What do you see?

And, if we had access to better, more granular data, what would you like to see?

Tuesday, May 12, 2015

Yes, Your Yield Rate is Falling

A recent article in the Chronicle of Higher Education pointed out the things colleges are doing to bolster their yield rates.  This of course, raised an interesting question among many outside of higher education: What's a yield rate?

Colleges admit many more students than they want to enroll, of course.  But let's say you want to enroll a class of 1,000.  How many, exactly, do you need to admit?  Most of the students you admit will have offers from other colleges as well, and they can only enroll in one place. So, if you admit 2,000, you need exactly half of them to enroll, which would mean a yield rate of 50%.  If you're not confident you can get that kind of yield, you admit more: 2,500 with a 40% yield rate gets you that same number (40% of 2,500).  But at most institutions, yield rates are closer to 30%, so that means 3,333.  Or thereabouts.
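The arithmetic is just the target class size divided by the expected yield, rounded up.  A quick sketch:

```python
# How many admits it takes to hit a target class, given an expected yield.
import math

def admits_needed(target_class: int, yield_rate: float) -> int:
    return math.ceil(target_class / yield_rate)

for y in (0.50, 0.40, 0.30):
    print(f"yield {y:.0%}: admit {admits_needed(1000, y):,}")
# yield 50%: admit 2,000
# yield 40%: admit 2,500
# yield 30%: admit 3,334 (the post's "3,333 or thereabouts")
```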

Alas, many colleges are afraid of that admit rate (the percentage of applicants admitted) getting too high, because for many parents and students, a low admission rate is a proxy for quality: An admit rate of 15% means (to some) that an institution is better than one with an admit rate of 30%.  And so on.  Part of what they do is generate "softer" applications, via a variety of methods I've talked about many times, ad nauseam. But the problem is that you don't know precisely who a soft app is, so you can't just take the same number of students, because the soft apps (with lower propensity to yield) will bring down your yield rate.  So admit rates fall, but ultimately, so do yield rates.

Managing and reviewing more applications is expensive, and if you care greatly about that admit rate, you try to keep it as low as possible and still make your class, by raising the yield.  Looking at demonstrated interest is one way; using financial aid more strategically is another; and finally, good old fashioned tactical approaches are another still.  Many places use all three.

Here is what our wheel spinning and tail chasing has spawned: Thirteen years of increasing applications, increasing admit rates, and decreasing yield and draw rates (draw is a better measure of market position vis-a-vis competitors because it punishes you if you try to appear more selective at the price of yield).
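Draw rate is commonly computed as yield rate divided by admit rate, with a higher draw suggesting a stronger market position; the formula below is that conventional one, with hypothetical numbers, not anything taken from this dataset:

```python
# Draw rate: yield rate divided by admit rate.  Sacrificing yield in
# pursuit of a lower admit rate shows up directly in this number.
def draw_rate(yield_rate: float, admit_rate: float) -> float:
    return yield_rate / admit_rate

# Hypothetical college: 40% admit rate, 30% yield.
print(round(draw_rate(0.30, 0.40), 2))  # 0.75
```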

The first view here shows colleges in groups, starting with all 1,432 public and private not-for-profit, four-year, degree-granting colleges in the US that admit freshmen in the traditional Carnegie classifications (Baccalaureate, Master's, and Doctoral, excluding Baccalaureate-Associates colleges). You can use the filters to look at any combination of variables you'd like to see how things have changed.

The second view (using the tab across the top) allows you to use the filter to select any single college. And, if you're like most people, the first ones you select will be the big names, whose trends appear to move in the opposite direction of the industry as a whole.  Which means that for all those institutions trying to look like they're in the RBL (REALLY Big Leagues), all that effort has put them farther behind.

What do you see? Leave a comment below.

Friday, April 17, 2015

State to State Migration of Freshmen

Previously, I did an analysis of colleges, showing which states freshmen came from in 2012.  It was very popular with people who are interested in the topic of geographic diversity.

But I heard another admissions person say last week (while I was at Missouri ACAC) that her college wants to enroll more students from outside the state.  It's not hard to figure out why: Students who cross borders (or who travel farther to college) are generally wealthier and have parents who are college-educated themselves, both of which make these students attractive targets.  Since the number of these students who are likely to migrate is essentially fixed in any given year, the intent to recruit more of them is really an effort to get a bigger piece of the pie.

But some states have naturally stronger pulls; others export more because of a relative lack of opportunity based on a smaller selection of colleges.  Where do students in your state come from?  Is your state enrolling a lot of students from out of state, or do you send a lot away?

Here is some 2012 IPEDS data for you to look at.


  • There are two views of the data (note the tabs at the top). 
  • On each of these two views, the columns--that is, the states listed across the top--represent the state where the colleges are located.  
  • The rows represent the home states of the freshmen (this includes only domestic students, not international students on visas).  
  • The box is colored by the percentage of freshmen at colleges in the column state who come from the state on the row.
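
If you had student-level rows, the matrix those bullets describe could be built like this. (IPEDS actually publishes this migration data pre-aggregated, and the column names and values below are invented, so treat this only as a shape illustration.)

```python
import pandas as pd

# Toy student-level records: where each freshman's college sits,
# and that freshman's home state.
students = pd.DataFrame({
    "college_state": ["AL", "AL", "AL", "AL", "GA", "GA"],
    "home_state":    ["AL", "GA", "GA", "FL", "GA", "AL"],
})

# Counts: rows = home state, columns = state where the college is located.
counts = pd.crosstab(students["home_state"], students["college_state"])

# Each cell as a share of that column's (college state's) freshmen --
# the quantity the color encodes in the visualization.
shares = counts.div(counts.sum(axis=0), axis=1)
print(shares.round(3))
```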

For instance, hovering over this box on the first view, you'll see that 24.6% of students at colleges in Alabama came from Georgia:

The first (orange and gray) view shows only non-resident students; the second view (with the purple boxes) shows all freshmen. On that one, notice that 20% of all freshmen at Iowa colleges come from Illinois. Knock it off, Iowa. Note the diagonal band where each state intersects with itself.  And note the range in color on that diagonal.

How attractive is your state, and to which ones? It's an important question to ask before you assume you can swoop into a state and get more students.

Friday, April 3, 2015

Using an Ecologist's Measure of Diversity in Higher Education

Diversity is a topic a lot of us in higher education think about and write about and work towards, and yet, we don't really have a common definition of what it means. At its most basic level, we simply talk about the percentage of our students who are non-white. And, of course, if you compare colleges today to those in the 1950s, this makes perfect sense, and allows us all to give ourselves a pat on the back.

But the success of Asian students over the past few decades has complicated this: While they are not white, their large numbers at the nation's most selective institutions, and their performance on college admissions examinations, make us occasionally shift the discussion to under-represented students of color, which today might include Native American and Alaska Native students, Latino or Hispanic students, African-American students, Asians who are Hawaiian or Pacific Islander, and students of two or more races or ethnicities. This of course causes us to wonder whether a student of mixed Asian/Caucasian ethnicity should count, and to remember that, technically, Hispanic is not a race. It's all very confusing.

On top of that, there are institutions that serve large numbers of under-represented students (HBCUs, for instance) but are not very diverse in the clinical sense: Almost everyone enrolled at those institutions is African-American. How do we describe diversity in a way that makes sense to everyone?

One way to do it is to use a measure called Simpson's Diversity Index. You can read about it here if you'd like, but it essentially works like this: once you categorize and count the population, you can calculate the likelihood that two members chosen at random are of different types. For instance, at a college in Puerto Rico, if you randomly select two students, the chance they are of different ethnicities is probably very small: You'll usually get two Hispanic students. Go to Howard University, and odds are you'll select two African-American students on your trials. This translates into a lower Simpson's number. At a university that is truly more diverse in the ecological sense, that number goes up.  All the values of the index fall between zero and one.
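
For the curious, the calculation itself is tiny. This sketch uses the common 1 - sum(p_i^2) form of the index, with made-up enrollment counts (not figures for any real campus):

```python
def simpson_index(counts):
    """Probability that two randomly chosen students are of different types.

    counts: enrollment headcounts by group, e.g. by ethnicity.
    """
    total = sum(counts)
    return 1 - sum((n / total) ** 2 for n in counts)

# Invented enrollments by ethnicity:
homogeneous = [980, 10, 10]         # nearly everyone in one group
mixed       = [300, 250, 250, 200]  # several sizable groups

print(round(simpson_index(homogeneous), 3))  # 0.039
print(round(simpson_index(mixed), 3))        # 0.745
```

Running it with a base of only students whose ethnicity is known, and then again with an "unknown" group added, gives the two variants of the index discussed below.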

Of course, it's short-sighted to measure diversity on race or ethnicity alone, but it's the thing we have the best data on. We can add other elements into the mix, but since the data are pre-aggregated, we cannot break the groups into subgroups (for instance, wealthy White students vs. poor White students), which would yield better insight.

Look below. The first view shows all four-year, public and private not-for-profit colleges and universities in the US, and their Simpson's Diversity Index as calculated from total undergraduate enrollment in fall 2013. On the first view, the bars are colored by freshman admit rate, which suggests an interesting theory: if your admit rate is low, you could be more diverse if you really wanted to be. In the tool tip that pops up when you hover over a bar (like in the screenshot right below), you'll see the breakdown of enrollment by ethnicity.

And if you hover over several bars in the same range, you'll see you can get to similar numbers in very different ways. So, even among diverse institutions, there are very different student body mixes in play.

On the second tab, you'll see some element of economic diversity added in: Pell Grant eligibility as a color. The chart is a scatter of Simpson's index against admission rate.

One note: I calculated the index two ways: first using as the base only students whose ethnicity is known, and then using all students, including those whose ethnicity was not listed.  I think the first number is probably the better one, but I did include the second in the tool if you're interested.

Do you see anything interesting here? I'd love to hear it.

Wednesday, April 1, 2015

Sorry, Harvard, Princeton, Yale, and Stanford. You Lose

This week, all the hype about college admissions comes out.  Blah blah blah this college admitted only 7%.  Blah blah blah Oh Yeah? We admitted only 6%! We're better.

So, I says to myself, "Myself, I says, what is the real measure of the best college?"  And it became clear: "The best college is the one everyone wants to copy!" And then I asked myself, "How do you do that?"  "Why," I replied, "by copying the name."

So here are the most common names of colleges in America.  As you can see, Columbia College and Bryan University are duking it out for the top spot.  If DeVry got their act together, they could win every year, as an informal analysis says there are about a bazillion of them.

So, sorry, Duke (if that is your real name).  Clearly, no one wants to be associated with the likes of you.

Tuesday, March 31, 2015

How Admissions Has Changed, in One Chart

I frequently hear that the interactive charts I publish are too confusing or time-consuming, and that it's hard to get the story out of them without some work. So today, I'm making it easier for you, for two reasons: First, this is real student data, not summaries: Each dot represents a student who applied for financial aid, and I'd never publish that underlying data on the web, so this is just a good, old-fashioned picture of a chart.  Second, in this case, one chart tells the whole story.

The population here is all freshman financial aid applicants who completed a FAFSA but did not have need.

Each column is one year, and each dot in that column represents a student; higher positions in the column show higher income, from zero to one million dollars in parental AGI (adjusted gross income).  This is arrayed as a box-and-whiskers, or box plot.  The yellow boxes show the limits of the middle 50% of the distribution (the "box"), with the color break representing the median.  The top whisker (the black horizontal line) represents the 75th percentile; in other words, 25% of the applicants have incomes above that line.  The bottom whisker marks the lowest 25 percent. Yes, there are people with very low incomes who do not qualify for need-based aid, usually due to large asset bases.
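
If you want to see where those box edges come from, the whole summary boils down to a handful of percentiles. This sketch uses synthetic lognormal incomes, not the real applicant data:

```python
import numpy as np

# Fake parental AGIs for one year's applicants (lognormal is a
# rough stand-in for an income distribution; parameters are invented).
rng = np.random.default_rng(0)
incomes = rng.lognormal(mean=11.8, sigma=0.6, size=5_000)

# The box spans the 25th-75th percentiles; the color break is the median.
q25, q50, q75 = np.percentile(incomes, [25, 50, 75])
print(f"box bottom (25th percentile): {q25:,.0f}")
print(f"median:                       {q50:,.0f}")
print(f"box top (75th percentile):    {q75:,.0f}")  # 25% of applicants sit above
```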

Note the way the black line rises over time, from about $430,000 in 2007 to almost $600,000 in the last two years.  There are several possible explanations for this, all of which are probably valid to some extent.

  • It's a buyer's market, and college recruitment activities have brought in people who are shopping in more places
  • People who never would have applied for aid in prior years are doing so, because the crisis of 2007 wiped out many assets, like home equity, that people might have used to pay for college
  • Other colleges are requiring a FAFSA for merit aid consideration so we get the FAFSA as a residual.  No one, it seems, is opposed to trying to get a lower cost
  • Colleges are so afraid of losing someone due to price considerations that they encourage everyone to "give it a shot" and see if they are eligible.

One note: In 2014 we had 31 applicants whose income was $1,000,000 or more who are not shown here, and who would have pulled the distribution up. These people used to show up in prior years as $999,999, so I took them out for equal comparisons. And, in anticipation of the next bump, we did have one family who reported an AGI of $9,999,999 for 2014 when they completed the FAFSA.

This post shows Financial Aid data, but the title says it's about how admissions has changed. What do you think? How are the two related?

Tuesday, March 24, 2015

Application Fees

Ever since my first day in admissions, I've had a big problem with the concept of college application fees.  They just seem odd to me: You pay some amount of money for the privilege of being considered for admission, often not certain you'll attend if you are.  And if you're not admitted, you're out of luck.

I understand those who support the concept, in concept: Students shouldn't apply to a lot of colleges, and they should be somewhat serious about the colleges they apply to.  Except we know that doesn't happen.  The counselor at my kids' school said a few years ago that one student applied to 46 colleges, and the Fast Apps, Snap Apps, and VIP apps all encourage students to apply to places just because they can.

I also realize that there are costs associated with processing applications, although those costs have dropped pretty dramatically in the past several years, especially when all the documents come in electronically. But all the costs of doing business are paid for by the students who pay tuition, and, presumably, more applications are good for the college those students attend.

There may be other models where this system is used, but I'm not able to come up with any.  All I can think about is having to pay $50 just to walk onto the Toyota lot and shop for cars (which is hardly a perfect analogy, either.)

Too often in the discussion about things like "admit to deny," people will point out that app fees from students who have little chance of being admitted are a revenue source for colleges.  Technically yes, but actually no. At most institutions, it's about 1/10th of 1% of total revenue.

So, take a look at what colleges charge to apply.  This visualization starts with just under 2,000 four-year colleges and universities, each represented by a dot.  IPEDS apparently lists the highest fee a college charges when there are multiple levels.

Hover over the dot for details.  The bar chart at the bottom shows the breakouts as a percent of total. Use the filters on the right to show a smaller set of colleges.

What do you notice?