Thursday, February 11, 2016

Graduation Rates by Selectivity: Freshmen, 2007

This is the second part of my visualization of graduation rates from NCES. Part I is right below this one, or if you want, you can click here to open it in a new window.

People in higher ed, and especially in government, talk a lot about graduation rates, and the presumption is this: That graduation rates are something we credit or blame on the colleges; that is, something a particular college does determines whether or not its graduation rate is high.  If Princeton stopped caring, presumably, its graduation rate would collapse.

Well, maybe.  Probably not, though.

We can see that a single factor, such as percentage of students in the freshman class with Pell, or the mean SAT score, can predict with some precision the graduation rate of a college or university.  If you don't believe me, see for yourself.

There is some variation in rates of colleges with similar profiles, of course, and people believe--correctly, or incorrectly, I'm not sure--that this is the important difference, or the value added by the particular college.  Maybe, but given the percentage of variance explained by single variables, I'm willing to guess other pre-college characteristics explain a lot of that unexplained variance.  Even as dull an instrument as US News and World Report realized years ago that having more Pell students lowered your graduation rate, all other things being equal.

Which leads us to this: The entering freshman class of 2007, and their six-year graduation rate, broken out by gender, ethnicity, and the selectivity of the college. You can see the pattern: The more selective the school, the higher the graduation rate.

Consider this.  You are headmaster at a college where the only thing they teach is dunking a basketball.  At the end of the course of study, students are given a test: 100 attempts to dunk the basketball.  And your school has a dunk percentage of 74.3%, the highest in the nation, and far better than any other Dunking College in the US.  All the people in Tallsville, where you're located, are very proud of you, as you educate mostly local kids from Tallsville, named for the Tall family.

The next year, you get ten times as many applicants.  And, being a college that wants to turn out the best dunkers (it's in your strategic plan, of course), you are suddenly able to admit only the tallest applicants, with the biggest vertical jumps and the largest hands.  Using the same instructional tools you've always used, your dunking percentage skyrockets to 98.2%.  And next year, guess what happens to applications? And guess whom you select from that pool?

The nation's oldest and wealthiest colleges mostly had a head start of several hundred years on the rest of us. And because they rose to prominence in times when college was almost exclusively the bastion of wealthy, white men from the upper crust of society, they have long histories of turning out men who end up, not surprisingly, wealthy and white.

Their reputation ensures that their position in the market will be strong for as far as the eye can see, and will allow them to select only students who, albeit not always white, wealthy men anymore, are destined to graduate from college.  If you're a little less selective, you have a little less luxury of choice.  And so it goes.

There is, of course, nothing wrong with that.  But choosing a college because of its graduation rate is backwards: The college will select you based on your propensity to graduate. Ponder that.

Do you agree? Or not?  Either way, I'd love to hear from you.



Tuesday, February 9, 2016

Graduation Rates, Rolled Up

I like the NCES Digest of Education Statistics, but some of the reports they present are almost unusable.  If you've ever tried to visualize a report like this, you know what I mean.  If anyone from NCES is reading this, please encourage the good people there to put the data in a clean, unformatted report.

But, on to the data.  This is pretty simple, actually, and it bounces off previous visualizations I've done that show graduation rates are as much an input as an output.  The data are presented in three views: Over time, summarized for a single year, and by institutional type.  Click on the tabs to see them all, and use the filters on the right to select subsets.  You may want to look at women, or Hispanic students, for instance, and you can do so here.

There are some interesting patterns here.  It's clear that women, across the board, have higher graduation rates than men; and that colleges are not serving African American men very well.

What else do you see? Anything surprising? Leave a comment below.

See this, too, for another way of looking at college graduation rates.



Friday, February 5, 2016

In-state enrollment and Pell

A recent article in the Washington Post piqued my interest: Why the University of Oregon turned to neighboring states for students, by my colleague Roger Thompson.

Some of this is no surprise, of course. I've been looking at NCES and WICHE data for years, and even visualized the latter to show how demographics will change enrollment profiles at colleges across the country.

Lots of publics realize this, and lots have attempted to enroll larger numbers and percentages of students from outside their states. There's more to it than population, however: It's one of education's worst-kept secrets that students who travel farther to college come from families with higher incomes (or vice versa, of course), and in general, so do students who cross state lines for their education. Public universities have discovered that high out-of-state tuition makes them less desirable, and so have recently adopted a revenue maximization model, often offering large discounts to non-resident students.  It's generally better to get 70% of $30,000 than 0% of $30,000, especially when resident students might only pay $12,500, assuming the enrollment boost is sufficient.
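
Here's a quick back-of-the-envelope sketch of that trade-off in Python; every number in it is invented for illustration, not taken from any institution's actual figures.

    # Hypothetical revenue comparison for one non-resident seat, illustrating the
    # "70% of $30,000 beats 0% of $30,000" logic. All figures are made up.
    STICKER_NONRESIDENT = 30_000   # published out-of-state tuition
    RESIDENT_NET = 12_500          # what a resident student might actually pay
    DISCOUNT_RATE = 0.30           # institutional aid as a share of sticker price

    net_nonresident = STICKER_NONRESIDENT * (1 - DISCOUNT_RATE)

    print(f"Net revenue per discounted non-resident: ${net_nonresident:,.0f}")
    print(f"Net revenue per resident:                ${RESIDENT_NET:,.0f}")
    print("Revenue from an empty seat:              $0")

    # The discounted non-resident still brings in more than a resident, and far
    # more than an empty seat -- provided the discount actually produces the
    # enrollment it was meant to buy.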

This is a natural reaction by the universities in light of massive state funding cutbacks.

To boot, wealthier students have higher test scores, on average, and at prestige-conscious universities, this can be an added bonus.

The danger, of course, is that this might exclude lower-income in-state students.  So, I took a look at the two factors over time at about 550 public colleges and universities.  The visualization below gives you the option to compare any four institutions by choosing the ones you want from the individual drop-down boxes: Orange lines show percent of freshmen from in-state; purple shows percent with Pell over time.  In order to select a college, just click on the box and start typing any part of the name. If you want UC Santa Cruz, for instance, you'll get better luck with "Cruz" than "California."

I started with four at random.  Make your own set.

What do you see? And are you surprised?

Note: Since I posted this, a colleague has pointed out that the IPEDS data are not sufficiently granular to separate in-state and out-of-state Pell students, which would have been the ideal way to look at this.  And, in addition, I'll add that a university could a) increase the percentage of its state's HS graduates enrolling, b) increase the number of Pell students, and c) see the percentage of Pell students go down if the freshman class grew substantially.
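
A toy example makes that last point concrete; the numbers below are invented purely to show how the count of Pell students can rise while the percentage falls.

    # Invented figures: a freshman class that grows while Pell enrollment also
    # grows -- yet the Pell percentage falls because the class grew faster.
    year_1 = {"freshmen": 2_000, "pell": 500}
    year_2 = {"freshmen": 3_000, "pell": 650}   # more Pell students in absolute terms

    for label, yr in (("Year 1", year_1), ("Year 2", year_2)):
        pct = yr["pell"] / yr["freshmen"] * 100
        print(f"{label}: {yr['pell']:,} Pell of {yr['freshmen']:,} freshmen = {pct:.1f}%")

    # Year 1: 500 of 2,000 = 25.0%
    # Year 2: 650 of 3,000 = 21.7%  <- percentage down even though the count rose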





Wednesday, February 3, 2016

Degrees awarded by Discipline, Ethnicity, and Gender, 2011 to 2013

This is three years of data from the NCES Digest of Education Statistics, breaking out all bachelor's degrees awarded by ethnicity, gender, and discipline.  For the sake of clarity, I rolled many of the disciplines together, and on at least one view, rolled up ethnicities into groups as well.

The first view simply takes a look at ethnicity and gender: What do Asian women, or Hispanic men, study in college?  Eight views on one dashboard, showing some interesting stuff: 30% of Asian women study Science and Math, compared to just 9.5% of African American men.  Business dominates among men of every ethnicity except Hispanic men.  Interesting.






Behold the power of DataViz.  This view is the exact same data, just shown a different way to allow you to get a comparative view.  This shows all ethnic groups in the data set, however, and the data in each column add up to 100%.  So, for instance, in the very top left, of all degrees awarded to Asian women, 19.82% were in business.  The figure is 34.3% for nonresident (international) men.




The third view turns it all around. Here you can see all the degrees awarded in a specific discipline in those three years, and see how they were distributed.  For instance, of all the degrees awarded in Education, 65% went to Caucasian women; of all the engineering degrees, 8.3% went to Asian males.
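
If you wanted to reproduce those two directions of comparison outside Tableau, a minimal pandas sketch looks something like this; the miniature table of degree counts and the group labels are invented, and the real NCES table is far larger.

    import pandas as pd

    # Tiny invented table of degree counts: rows are disciplines, columns are
    # ethnicity/gender groups.
    counts = pd.DataFrame(
        {"Asian women": [4_000, 6_500], "Nonresident men": [5_200, 3_100]},
        index=["Business", "Science and Math"],
    )

    # Second view: within each group, how are its degrees distributed across
    # disciplines? Each column sums to 100%.
    within_group = counts.div(counts.sum(axis=0), axis=1) * 100

    # Third view, "turned around": within each discipline, how are its degrees
    # distributed across groups? Each row sums to 100%.
    within_discipline = counts.div(counts.sum(axis=1), axis=0) * 100

    print(within_group.round(1))
    print(within_discipline.round(1))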




The fourth view is a little more complex, and allows you to create your own view.  For each discipline shown, the colored bars add up to 100% for the groups selected.  At first, it's a little noisy: Both men and women, and all ethnicities.  But this is where you can get interactive.  Look at just Hispanic students, for instance, by de-selecting everything else; or see just men, if that's what you want.  The bars will always recalculate, and the very bottom bar rolls up all degrees into one bar, for comparison.

What do you see that jumps out at you?  Let me know in the comments at the bottom.

Thursday, January 21, 2016

2014 IPEDS Admissions Data

This is always a popular post: Statistics on the entering class of 2014 at about 1900 colleges and universities across the country.  It's based on IPEDS data, which I downloaded from the IPEDS data center and conditioned.  The source file is here, if you'd like to do something with it yourself.

This year, NCES only reports test score ranges for those colleges and universities that require tests for all applicants; in some regard, this makes sense, but it's unfortunate.  At my institution, for instance, about 94% of enrolling students submit tests, and this data might be helpful to students who do plan to apply with tests.  I plan to let NCES know this was not a good idea, and you can, too, if you'd like.  For now, at least you'll know why these colleges don't show up; you'll have to check with the colleges themselves.

This view starts with private, Liberal Arts Colleges in the Great Lakes region, but you can make the list be whatever you want using the filters across the top.  Be aware that if you select "New England," for instance, you can't then select "Florida" until you re-set the region filter.

The views from the top down are:


  • Admit rates, with the overall rate on the left, and men and women on the right
  • ACT Scores, at the 25th and 75th percentile
  • SAT CR Scores, at the 25th and 75th percentile
  • SAT M Scores, at the 25th and 75th percentile
You'll have to scroll down to see all four boxes, and within each box, use the scroll bar.





Friday, January 15, 2016

The latest Boogey Man: Frontloading

It's happened three times in the last several months: I am invited into, or stumble into, a discussion on "Frontloading."  The people talking about it are generally convinced it exists, and generally believe it's a widely practiced approach.

In case you don't know, frontloading is the presumed practice of enrollment managers (of course) who make big institutional aid awards to entice freshmen to enroll, and then remove them after the freshman year.  Journalists, especially, point to aggregated data suggesting that the average amount of institutional aid for non-freshmen is lower than for freshmen. "Aha!" they scream, "The smoking gun!"

Well, not so fast.  I'm willing to admit that there may be a few colleges in the US where frontloading happens, probably in a clandestine manner, but perhaps, in at least one instance I was made aware of, for a very logical and justifiable reason.  But most enrollment managers I've asked have the same reaction to the concept: To do so would be career suicide.  This does not deter those who love to skewer enrollment management and hoist the problems of higher education squarely on our backs.

To be sure, I asked a Facebook group of 9,000 college admissions officers, high school counselors, and independent college consultants about the practice.  This is not a group of wallflowers, and the group members call it like they see it; even so, I asked them to message me privately if there were colleges where this routinely happened.  I got a couple of "I think maybe it happens," responses, and exactly one comment from a counselor who said she was sure it happened.

I have told people repeatedly that there are many possible reasons why the data look the way they do:


  • The freshman data are for first-time, full-time, degree-seeking students.  All of them are technically eligible for institutional aid.
  • The "All students" data includes all undergraduates.  That includes full-time, part-time, non-degree seeking students, many of whom are less likely to qualify for aid.
  • The "All students" group also contains transfers, who may qualify for less aid
  • It's possible that students who earned aid as a freshman lose it due to lack of academic progress or performance in subsequent years
  • It's also possible students with the most institutional aid are the neediest, and thus not likely to be around past the freshman year
These reasons seem to fall on the deaf ears of people who are eager to prove something they already believe to be true.

So I tried another angle: Doing the same comparison of Pell Grant participation.  The Pell Grant, of course, is a federal program, not awarded by or controlled by the colleges.  What would happen if we looked at Pell Grant participation rates among freshmen and the rest of the student body?  I think this visualization, below, demonstrates it quite well.

There are four columns: 

  • Percent of freshmen with Pell
  • Percent of all other undergraduates (non-freshmen) with Pell
  • The percentage of undergraduates who are freshmen (in a traditional college, this should be about 25%-30%.  Bigger or smaller numbers can tip off a different mix)
  • And finally, the right hand column, showing the difference between the first two.
The bars are color-coded: Blue bars show more freshmen with Pell; red bars show fewer freshmen with Pell, and gray bars show no difference.  You'll note that most bars are blue; if this were institutional aid, you might leap to the conclusion of frontloading.  That's exactly what journalists do.
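
For anyone who wants to rebuild that comparison from a raw IPEDS extract, here's a minimal sketch of the arithmetic; the column names and figures below are hypothetical stand-ins, not the actual IPEDS variable names.

    import pandas as pd

    # Hypothetical columns standing in for an IPEDS extract; real variable names differ.
    df = pd.DataFrame({
        "institution": ["College A", "College B", "College C"],
        "freshmen": [1_200, 800, 2_500],
        "freshmen_pell": [420, 160, 700],
        "all_undergrads": [4_800, 3_400, 11_000],
        "all_undergrads_pell": [1_300, 700, 3_100],
    })

    df["pct_freshmen_pell"] = df["freshmen_pell"] / df["freshmen"] * 100
    # "All other" undergraduates excludes the freshman class itself.
    df["pct_other_pell"] = (
        (df["all_undergrads_pell"] - df["freshmen_pell"])
        / (df["all_undergrads"] - df["freshmen"]) * 100
    )
    df["pct_undergrads_who_are_freshmen"] = df["freshmen"] / df["all_undergrads"] * 100
    df["difference"] = df["pct_freshmen_pell"] - df["pct_other_pell"]

    # Color rule from the visualization: blue = more freshmen with Pell,
    # red = fewer freshmen with Pell, gray = no difference.
    df["color"] = df["difference"].apply(
        lambda d: "blue" if d > 0 else ("red" if d < 0 else "gray")
    )

    print(df.round(1))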

But it's not institutional aid.  It's federal aid.  And yet, the differences are very large at some places.

You can hover over the top of any column to sort by that column.  And you can use the filters at the right to limit the colleges included on the view to certain regions, control, or Carnegie type.

Is this evidence strong enough to convince you?  If not, let me know why in the comments below.



Wednesday, January 6, 2016

In Which I Break the Rules

I've had a long-standing rule when publishing to this blog: I don't take requests.  This is for two reasons: First, I do this for fun, and I publish what's interesting to me, hoping you'll find it compelling as well.  Second, the tools available now allow you to answer your own questions fairly easily.  If you like my visualizations but want to ask some more questions, you can download free versions of Tableau Public and explore to your heart's content.

But yesterday, when I published this piece on freshman migration, a topic that generated a lot of interest this time and the first time I did it, I admitted this was the most fun I've had exploring data.  And I also admitted that I had dozens of views lined up, but, in the interest of keeping things simple, kept just two.

Since it's gone live, I've had about 20 people ask me "Could you look at this data this way?" questions.  That's exciting, because I'm always hoping what I publish generates as many questions as it answers.

So, for once, I'm taking requests, and even some suggestions, specifically from Ian Pytlarz and Carolyn Rockafellow of a Google+ group for higher ed users of Tableau.  Thanks to them.

Six views here, all in one workbook, and accessible via tabs across the top:


  • State exports: Numbers and percentages in a scatter
  • State "stay-homes": Numbers and percentages in a scatter
  • Bar charts of exports, showing number and percentages.  You can sort by either column
  • Pie charts of all 50 states and DC (another rule I broke, as pie charts are not very good for such things, but some people like them, and if you don't you can skip them and get the information elsewhere)
  • Individual college enrollments by in-state/out-of-state, or in-region/out-of-region
  • Enrollments in colleges within a state in aggregate, showing percentage from in-state and out-of-state
But that's it!  I may do some more internal analysis for our own use here at DePaul, but if you want more, you can make yourself data independent (and I'd love to see what you do with this!)







Tuesday, January 5, 2016

Freshman Migration, 2014

Note: Please be sure to hover over a bar on the first chart, below, to see how to interact.

When I was a kid, I was fascinated by license plates on cars.  And whenever I found myself in a college parking lot, it was like a buffet, with lots of plates from distant states. Thus, my fascination with freshman migration and out-of-state enrollment was born.

IPEDS has finally released 2014 Fall enrollment data, and that means the biennial availability of the freshman migration data.  I like visualizing this for two reasons: First, I can think of dozens of ways to show it, all of them interesting to me, and maybe to you, too.  Second, the data are so multi-faceted that they require viewers to interact, something I've preached about for years: Don't let me decide which data are interesting; decide for yourself how you want to view it.

There are two ways I've presented it here.  On the first, you start by looking at the states that exported the most freshmen in 2014.  If you want to look at the colleges those students attended, just click on the state bar in the top view; the destination colleges below update.  If you only want to look at liberal arts colleges, or colleges in the Southeast, you can do that using the filters.  Remember, there is a reset button at the bottom of the visualization.  You can't break it.

The second view shows individual colleges; I've started with my own.  You can see where the 2014 class came from.  First select All students or just Out-of-state students, depending on your interest. The tree map updates to show both regions (in color) and states (each square).  Hover for details.

What do you see? And do you like this as much as I do?

(As always, data presented is believed to be accurate, but there are occasional problems with IPEDS, including incorrect and missing data).






Tuesday, November 17, 2015

How will demographics change enrollment?

Ever since I started in admissions, people have been talking about demographic changes and challenges, and the chant continues.  The future, we're told, will look very different than the present.

Our trade paper, the Chronicle of Higher Education, ran an article about how this might affect higher education.  It included lots of interesting charts and graphs, but didn't allow me to look at the data in the ways I wanted to.  So I downloaded it and started looking at it using Tableau.

This is as much a testament to self-service BI as it is to the trends in the data.  I've often spoken about the 80/80 rule of business intelligence: 80% of what an analyst gives you, you don't need; 80% of what you want isn't in the report.  I spent a long time playing with and slicing this data to see if I could find a way to present it that makes sense, and that gives people what they want.  And every time I answered a question, I generated several more ("what if" can waste a lot of time).

In the end, after several different views, I settled on the first one, below.  It's very simple, yet it gives you the flexibility to find out most of what you need.

On the chance that you want or need something else, though, I kept the other views I had been experimenting with.

View 2: Maps and Details allows you to see the data mapped; once you filter to a region, you can see how states compare.

View 3: Changes within a State over Time looks at the same data four ways: Numbers, percent change, percent of total, and numeric change by ethnicity.

View 4: Counties Mapped allows you to select a state and see where concentrations of ethnicities live; choose a state, choose the ethnic group and age of the population, and see the results.

View 5: States and Counties shows ethnic percentages for every county, listed by state.

View 6: Counties shows all counties regardless of state.  Did you know there are 40 counties in the US where every 18-year-old is white? Or that one county in South Dakota is 98% Native American?

Some notes about the data are on the CHE website.  Be sure to read them so you know what this shows and doesn't show.

Again, remember to interact.  You can't break anything.

And if the frame is not displaying the visualization correctly, you can go right to the original on the Tableau Public website.





Tuesday, September 29, 2015

The Pell Partnership Data

Yesterday's big news, of course, was the announcement of "The Coalition," the curiously-named group of about 80 colleges and universities making for very strange bedfellows.  I wrote a little bit about it here.

Today I came across a little data set that contained information about Pell graduation rates and non-Pell graduation rates, and I thought it an interesting opportunity to look at it in light of yesterday's news.  So, over my lunch hour, I did (yes, this software is really that easy to use. You should try it out.)

It's presented here in Tableau Story Points.  Just use the gray boxes across the top to look at the different views of the data.  Most of it should be self-explanatory, but if not, leave a comment and I'll reply to it.

FYI, there were several schools from "The Coalition" who did not supply Pell Grant Grad Rate data. In alpha order, they are:


  • Columbia (NY)
  • Hamilton (NY)
  • Harvard (MA)
  • Rutgers (NJ)
And you can ask them why they didn't.  I would never speculate about such things. (Olin College of Engineering did not provide data, claiming their sample size was too small.)







Monday, September 28, 2015

The Peacekeeper Missile Comes to Admissions

Maybe you're too young to remember the Ronald Reagan presidency, but one of the things I remember most is the "Peacekeeper Missile." People were incensed by what they believed to be political doublespeak worthy of the book 1984.  Missiles were objects of destruction, not something you associated with peace.  Change the language, change the discussion.

So today, this happened.  In what Inside Higher Ed is calling "An Admissions Revolution," eighty of the country's top colleges have formed a "Coalition" (a nice political-sounding word: I mean, they form coalitions in Canada, so it must be nice, right?) to create a new application, as well as a new portfolio system that lets students, starting as early as the 9th grade, assemble documents and other resources, not unlike my suggestion about Google managing the application process.  The goal, ostensibly, is to get more low-income and first generation students interested and ready to go to college, and to apply to these mostly-selective institutions.

This sounds great, right? Right?  You'd think so.

Of course, if you know anything about college admissions, your first question might be this: Today, one day after the announcement, which group is probably more aware of The Coalition?  A) first-generation, low-income students of color from under-resourced high schools, or B) white students from wealthier, college-educated families who are already planning for college at--or well before--the 9th grade.  I'll give you a moment.

In an industry already obsessed with prestige, this sounds like a club that won't take just anyone as a member, unlike the Common App, which has recently--God help us all--begun to allow colleges to determine for themselves what admissions criteria are important.

The collective gasp from the super selective members of Common App sounded like a Rockefeller in the presence of someone who extended the wrong pinkie finger when drinking tea.  "We just can't have these, these, Commoners, in the Common App," they decided without discerning a hint of irony, and they started their own country club, which of course, will do the requisite charity work one expects of any decent country club.

The standards for membership are fairly arbitrary: A 70% graduation rate for all members; for privates, a pledge to meet "demonstrated need" (a patently ridiculous term both in definition and in the way it's practiced); and for publics, "affordable tuition and need-based aid for in-state students."

Does that seem backwards to you?  Shouldn't public institutions, which I believe were generally founded by the public for the public, be held to a higher standard of serving, you know, the public they're supposed to serve?  And of course, remember my frequent rant that high graduation rates are an input, not an output.  Even as blunt an instrument as US News and World Report recognizes that if you enroll more Pell grant recipients, your graduation rate will drop.

Which brings me to the last point.  These institutions are, for the most part, selected from the institutions that a) have the most resources, b) charge the most, and c) enroll the fewest Pell grant kids.  Is this new application, which fragments the process even further, and clearly--not even possibly, but clearly--favors wealthier kids really the answer?

Or is the name--The Coalition for Access, Affordability and Success--just a political ploy from institutions that don't really seem to know much about access in the first place?  A new take on the Peacekeeper Missile? An homage to 1984?

Look at this, showing about 1700 four-year private and public institutions, each as a bubble.  The Coalition institutions are in red, everyone else in gray.  Colleges to the right have higher median SAT scores in the freshman class (another proxy for wealth, of course); colleges lower on the chart have fewer Pell grant kids as a percentage of all freshmen.  Larger dots are wealthier.  Hover over any dot for details about that college.

The two-bar chart at the top shows Pell Grant enrollment.

There is one filter, to allow you to look at all institutions, just public, or just private.  Go ahead, click. See if it makes much difference.  And remember:

“War is peace. 
Freedom is slavery. 
Ignorance is strength.”


“It's a beautiful thing, the destruction of words.” 




Tuesday, August 18, 2015

How Pell Grant Recipients Fare at America's 80 Largest Universities

On my train ride in this morning, I saw an article posted on Twitter about Pell Graduation rates at the 80 largest universities in America.  If you want to look at a boring table of static data, just click here.

But I wanted to see if there were any patterns, so I copied the table, pasted it into Excel and then opened in Tableau to visualize it.  I think it tells an interesting story, although the data set is unfortunately limited, and with no key to merge the data with another set, it loses some potential.

Start by looking at the first view.  For each institution, there are three columns: The overall six-year graduation rate; the six-year graduation rate of Pell recipients, and the spread, with the values on spread sorted from low to high.  In this instance, a negative number means Pell students graduate at a higher rate than the student body overall, and a positive number means just the opposite.  As you scroll down the list from top to bottom, ask yourself what might explain the pattern.  There are dozens of possibilities; all I could see was "football," but you might see "big public research university," or something else altogether.
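
If you'd rather compute that spread yourself from the table, a minimal sketch (with invented numbers) is below; note the sign convention described above, where a negative spread means Pell students out-graduate the student body overall.

    import pandas as pd

    # Invented example rows; the real table covers the 80 largest universities.
    df = pd.DataFrame({
        "institution": ["Univ A", "Univ B", "Univ C"],
        "grad_rate_overall": [82.0, 64.0, 71.0],
        "grad_rate_pell": [85.0, 55.0, 71.0],
    })

    # Spread = overall rate minus Pell rate, so negative values mean Pell
    # students graduate at a higher rate than the student body overall.
    df["spread"] = df["grad_rate_overall"] - df["grad_rate_pell"]

    # Sorted low to high, as in the first view.
    print(df.sort_values("spread").round(1))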

If you want to sort by another column, hover over the axis until the little icon pops up and click away. The "reset" at lower left does just what it says it does.

The second view (on the tabs across the top) shows the Pell graduation rate scattered against the percentage of freshmen with Pell.  The bubbles are colored and sized by spread (blue and large are good for Pell students; red and small, not so much.)  Right away you see the pattern: If you enroll fewer Pell students, your Pell graduation rate is higher.  My hypothesis would be that more selective institutions (who have higher graduation rates overall) a) simply select the most capable from among the poor students they admit, and b) have more resources to fund the smaller percentage of low-income students.

What do you see?



Wednesday, August 12, 2015

Watch Out, Guys

Women have made tremendous strides in educational attainment of bachelor's degrees in the last half of the 20th century and the first decade of the 21st.  And even though doctoral degrees have lagged behind, we can see dramatic changes there as well.

Take a look at this visualization using National Science Foundation Data (this link downloads the data for you in Excel as Table 14).  What you see over time is a dramatic increase in the number of women who earned doctorates since 1983, but also a shift in the percentage distributions. Women are now the majority in Life Sciences, Education, and Social Sciences, and close to dead even with men in all fields except Physical Sciences and Engineering.

The second view (using the tabs across the top) shows doctorates by broad discipline over time.  Use the filter at the top to compare men and women, or to see the totals.  Note the tremendous percentage growth in women in engineering since 1983: From 124 to 2,051, an increase of over 1,500%.

While it's not necessarily true that most doctoral recipients work in higher education, it's true that higher education gets most of its instructional faculty from doctoral recipients; the long, slow trend (assuming it will continue, or even just stabilize) means there are some interesting changes in store in the higher education labor force in the coming decades.  It's possible college faculty will look very different 20 years from now.

What do you think?

P.S. You might also be interested in this, showing bachelor's attainment over time.



Tuesday, July 21, 2015

What Happens to 100 9th Grade Students in Your State?

While waiting for 2014 IPEDS data to come out, I've been searching the web for more good educational data to visualize, and came across this site, where I found a nice little data set.  It's from 2010, and tracks 9th graders through high school and college.

We typically think of looking at high school graduates and measuring how well they do, which is important, of course.  But you can have a high percentage of graduates enrolling in or graduating from college masking a problem of high school dropouts.  This data helps look at that.

For all the data here, assume you start with 100 students in 9th grade in the state:


  • What percentage of them graduate from high school?
  • What percentage of them enter college?
  • What percentage make it to the sophomore year of college?
  • What percentage graduate from college within 150% of normal time (in other words, within six years)?
Finally, there is another, more traditional measure included: The percentage of high school graduates who graduate from college.
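
Here's a tiny worked example, with invented rates, showing how the 9th-grade base relates to that traditional high-school-graduate base; the same outcome looks quite different depending on which denominator you use.

    # Invented rates for a single hypothetical state, per 100 ninth graders.
    ninth_graders = 100
    hs_grad_rate = 0.80        # share of 9th graders finishing high school
    college_entry_rate = 0.65  # share of 9th graders entering college
    college_grad_rate = 0.30   # share of 9th graders earning a degree within 150% time

    hs_grads = ninth_graders * hs_grad_rate
    college_entrants = ninth_graders * college_entry_rate
    college_grads = ninth_graders * college_grad_rate

    # The "traditional" measure uses high school graduates as the base instead.
    traditional = college_grads / hs_grads * 100

    print(f"Of 100 ninth graders: {hs_grads:.0f} finish HS, "
          f"{college_entrants:.0f} enter college, {college_grads:.0f} earn a degree")
    print(f"Traditional measure: {traditional:.1f}% of HS graduates earn a degree")
    # 30 of 100 ninth graders vs. 37.5% of high school graduates: the same outcome
    # looks considerably better when dropouts are excluded from the base.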

The data are interesting by themselves, but I also rolled in census data of median family income by state in 2001, presumably the year the 9th-grade tracking began.  It's by no means perfect: New York City and Elmira in New York, for instance; Dallas and Colorado City in Texas; or Hollywood and Fresno in California share very little except a state capital. I've made no adjustments for purchasing power of a dollar, either.  The high incomes in Alaska mask a much higher cost of living, and the remoteness of the state and relative dearth of post-secondary options make its attainment rating skew low, in all probability.

  • On the first view, the map, hover over any state to get a popup chart.  Go to the top left corner of the 48 States map to zoom; resets are at the lower left of the visualization.  The states are colored by the percentage of high school graduates who earn a college degree.
  • On the second view, the scattergram, the x-axis is always the rank of median family income. Choose any other value to plot on the y-axis.  The states are colored by region, and you should note that the axes are reversed, so a rank of 1 is high and to the right.
  • And on the third view, a slope graph, you can compare any two measures of educational attainment in the states by using the right and left controls.  The line connects the two ranks.
What do you see here? I'd love to hear your thoughts.




Thursday, July 16, 2015

Where did you go to college?

Many people in higher education are fascinated with prestige, whether we like to admit it or not.  The question, "Where did you go to college?" can carry a lot of weight in job interviews or even casual conversation as people get acquainted.

The National Science Foundation annually publishes data telling us the colleges that produce the most alumni who go on to earn a doctorate from a US institution in a given year.  It's not a great data set in itself, and some brave soul will take IPEDS degree data and merge it to show which of these institutions are the most efficient producers by discipline, but that's not what you'll find here.

On this visualization, any time you see a college listed (UCLA, for instance), it shows how many bachelor's graduates of that institution earned a doctorate in 2012.  It's not the university that awarded the doctorate; that could be anywhere in the US (The University of Texas, or Stanford, for instance.)

There is some interesting stuff here, even if you just stick to the first visualization, where you can choose a broad or specific field, and see which institution produces the most alumni who earn a doctorate. I've sorted them by Carnegie Type, so that Carleton, for instance, doesn't have its accomplishments diminished by the big research institutions.

One thing that jumped out is the surprisingly high percentage of doctorates in engineering earned by graduates of foreign colleges and universities.

What else do you see?



Wednesday, July 1, 2015

Tuition Transparency Ratings

The Federal Government released its Tuition Transparency Ratings today, to help students and parents find out how fast colleges are raising tuition and net price.  And as is the case with many well-meaning government programs, the data doesn't always tell you the whole story.

The top chart on this visualization shows tuition and fees at about 6,000 colleges and universities; the light blue bar is 2011, and the orange square is 2013.  To the right is the two-year percentage increase.  If you want to limit your selections or sort the colleges differently, take a look at this image, which I've embellished with some instructions.  Click to view larger.


The second chart, at the bottom, shows net price for 2010 and 2012.  Net price is calculated after grant aid, which is only reported at the end of the year; that explains the delay.  It's pretty much the same: 2010 on the aqua bar, 2012 on the red dot, and percent change in the purple circle.  The filters and sorts work the same way on this one.

There are a couple of problems here: One is the data.  I could not find a single program on the New England Culinary Institute website that listed a tuition of $88,000, but that's the data shown here. There are several instances like that in this data; even if they are technically accurate because of the way a program is configured, it doesn't advance our understanding of the issue much.

But more important, net cost is a function of who enrolls and how much aid you can give: If you suddenly stopped enrolling middle-income students, or you have small enrollments, the results can be very volatile. Net cost is a remnant, not a target that can be tightly controlled.  And, it seems in many instances net cost is being calculated by different people in different ways over the two-year period.

Still, there is some good stuff here, I think.  Take a look and let me know.




Tuesday, June 23, 2015

Looking at Medical School Admissions

Most of the things I look at have to do with publicly available data sets, and that often means undergraduate admissions.  But while doing some investigation, I came across data from the Association of American Medical Colleges.  There's some interesting stuff there, and while it's formatted in a way that makes it really difficult to get to, it's worth a little work.  (I'm not convinced that the formatting isn't an attempt to keep less stubborn people from digging too deep on this; my request to get the data in a better format was ignored.)

Best thing I learned: In 2014, of the 49,480 applicants to medical school, 41.1%, or 20,343, enrolled. That's a far higher percentage than I would have thought, although it is lower than the 2003 rate of 47.5% (34,791 applicants and 16,541 matriculants).  It's clear, of course, that most medical school applicants are very well qualified, so that number represents the best of the best, but the perception of medical school selectivity is driven by the rates at each individual institution (sometimes 5% or less); in fact, each student applies, on average, to about 15 medical colleges, which skews the numbers.  These numbers are just for M.D. admissions, not D.O. or other medical professions.
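
The gap between a healthy aggregate enrollment rate and those frightening per-school admit rates comes mostly from multiple applications per person; a toy calculation (numbers rounded from the figures above) makes the point.

    # Rounded, illustrative numbers showing why per-school admit rates can look
    # brutal while the share of applicants who enroll somewhere stays fairly high.
    applicants = 50_000
    apps_per_applicant = 15          # roughly the average cited above
    seats_nationwide = 20_000

    total_applications = applicants * apps_per_applicant

    # Aggregate view: what share of people end up enrolled somewhere?
    enroll_rate = seats_nationwide / applicants * 100

    # Per-school view: once each person files 15 applications, applications
    # vastly outnumber seats, so school-level admit rates look tiny.
    apps_per_seat = total_applications / seats_nationwide

    print(f"{enroll_rate:.0f}% of applicants enroll somewhere,")
    print(f"but there are {apps_per_seat:.1f} applications for every seat nationwide.")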

This visualization has seven views, and starts with an intro.  You can get to the other six by clicking the tabs across the top:


  • A scatter, showing each medical college, colored by region, on two scales: Total applications and the number of applications per seat
  • Historical data for MCAT and GPA performance for applicants and matriculants over time
  • Applications, by ethnicity.  These are in a heat map format; the orange squares represent the highest values on that individual grid
  • Admit rates, by ethnicity.  This represents (I'm 99% sure) the chance that a student in the category shown, represented by the intersection of column and row, was admitted to at least one of the schools she applied to
  • Applications per seat in the entering class, broken out by male, female, and in-state status
  • Matriculant diversity, shown as male/female and in-state/out-of-state
By the way, if you need some understanding of MCAT scores, you can see them by clicking here.

If you're like me, you have a lot of questions that are not answered by the data AAMC provides.  But it's still a good start.  What do you notice here?





Monday, June 8, 2015

Diversity of Institutions, by Type

A few posts ago, I wrote about where students of certain ethnicities went to college.  In other words, if you looked at all the Hispanic students in the US, we'd want to see where they go to college, and compare that to Asian students, or students of two or more races.  I asked whether a student's ethnicity determined where they go to college.

This is the same data, but it examines it at the other end: The colleges, and how diverse they are.  In other words, do your location, control, size, and Carnegie type, for instance, determine how diverse you are, or limit how diverse you can become?

Again, the answer is no, but you can find some interesting trends.

If you're timid about using Tableau and interacting with it, here's your chance.


  • First, choose an Ethnicity in the top left corner.  For instance, assume you want to display the percentage of enrollment that is Asian.
  • Then, choose what value you want to display along the y-axis (the left side, from top to bottom)
  • Choose how to display the x-axis using two controls.  If you want just one dimension along the x-axis, make it the same variable for both x-axis controls.
Using the default values, look at the top right box.  This means that at the two private-for-profit Doctoral/Research Institutions in the Western States, the undergraduate enrollment is 81.6% non-white.  Hover over the box for details.


Now, click on that box, and the bar charts at the bottom update to show you those three schools, and the percentage of the student body of the ethnicity indicated.


As always, if you get stuck, just use the undo or reset buttons at the lower left:


There is a LOT to play with here.  What do you notice?





Monday, June 1, 2015

Enrollment at Women's Colleges, 2005 to 2013

Note: I got an email from Dean Kilgore at Mount Saint Mary's in California, who indicated I'd downloaded data for the wrong Mount Saint Mary College: in this case, the one in New York. I had to create the list manually, and it was just a slip on my part.

Sorry about that. I've removed them from the analysis, but unfortunately, can't add the correct one at this time without a considerable amount of work.



Sweet Briar College in Virginia recently announced, to the shock of many in higher education, that it would be closing at the end of this spring, 2015 term.  As often happens when a college decides to close, those who are or were close to it rally the troops and wage a fierce campaign to try to keep it open.  Sometimes it works, other times, it doesn't.

The scene playing out is not unusual: Allegations of secret deals, incompetence, blindness to all that is and was good at Sweet Briar. This is what happens when you decide to close a college.  And although I'm not taking sides, I did write before that the closing does seem to be curious in light of what little publicly available financial data there is: If you had to pick a college from this list that was going to close, it probably wouldn't be Sweet Briar.  Even the federal rankings of financial responsibility gave Sweet Briar a 3, a score higher than Harvard, which may only point out how absurd those ratings are in the first place.

A while ago, I downloaded a pretty extensive data set, using the members of the Women's College Coalition as my base.  Not all colleges have data available in IPEDS, however, so I did the best I could (for instance, the Women's College at Rutgers is not in IPEDS as a separate institution, or if it is, I couldn't find it.  And I took out Saint Mary of the Woods, as they just announced they're going co-ed).  Also, since there is no IPEDS data field that tells you when a college is a women's college, I couldn't go back and find out how many were labeled as such 20 years ago.  That might have been interesting.

Overall, though, the data were pretty uninteresting.  So I gave up on visualizing it.  There were trends, of course, but nothing dramatic.

So, when I saw this article, by one of the people leading the charge on the Save Sweet Briar campaign, one sentence jumped out at me:

Enrollment: There is no evidence that enrollment is declining, either at Sweet Briar or at women’s or liberal arts colleges. This claim is simply false. Numbers people, please check for yourself: The data are publicly available.

The data are available, and the link goes to the IPEDS site I use all the time.  So, take a look here. There are five views of the data, using the tabs across the top.  The first shows changes in freshman, total, undergraduate, and graduate enrollment over time.  The changes on the right are shown in relation to the prior year.  The second shows the same data, but the change is cumulative since 2005: As you can see, total undergraduate enrollment is down almost 6% during a time enrollment increased nationally.  The third shows admissions activity; the fourth breaks it out, showing Sweet Briar and all the other women's colleges in aggregate.  And the fifth shows total undergraduate enrollment in 2005 and 2013 (on the left) and change (on the right.)  As you can see, there are some big winners, big losers and a lot of small changes.
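
If you want to reproduce the two kinds of change in those first two views, here's a minimal sketch; the enrollment series is invented, but the arithmetic (change from the prior year vs. cumulative change since 2005) is the same.

    import pandas as pd

    # Invented undergraduate enrollment series for one college, 2005-2009.
    enrollment = pd.Series(
        [700, 690, 705, 680, 660],
        index=[2005, 2006, 2007, 2008, 2009],
    )

    # First view: change relative to the prior year.
    yoy_pct = enrollment.pct_change() * 100

    # Second view: cumulative change relative to the 2005 baseline.
    cumulative_pct = (enrollment / enrollment.loc[2005] - 1) * 100

    print(pd.DataFrame({"enrollment": enrollment,
                        "yoy_%": yoy_pct.round(1),
                        "cum_%_since_2005": cumulative_pct.round(1)}))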

Decide for yourself.  And tell me what you see:





Wednesday, May 27, 2015

Does Ethnicity Determine Where You Go to College?

The answer to the headline, of course, is "no."  Race is not a determinant of where you go to college, but race--or more probably the factors that vary with race and ethnicity--may influence your college choice set, which can, of course, influence where you go to college.

I've written before about how all these variables are at play with each other: In America, race, income, parental attainment, and presumably, opportunity, all cluster together.

And, after you look at this, you'll see how opportunity gets distributed by race, provided you're willing to click a button or two.  The visualization starts off showing all undergraduate enrollment in almost 7,000 post-secondary institutions that report to IPEDS.  (And before you object, I believe strongly that the college you attend is not your destiny, and that education is what you make of it, as I've written before on my other blog. But it's also clear that many people believe talent congregates at the "best" colleges, and the way this plays out in hiring and graduate school admissions can be troublesome.)

As you can see, almost 75% of all undergraduates go to a public institution; the majority of them go to Associate's granting institutions (almost all community colleges.)

But use the control at the top right to see how the distribution changes: Try Black or Hispanic or Asian or White to watch the bars move.  What you see is a change: Hispanic and African-American students go to community colleges and for-profits at much higher rates than their White and Asian peers.  White students go to private, not-for-profits at higher rates than almost any group, except Non-residents.

International (Non-resident, here) students flock to research universities, but are also far more likely than any group to attend a private university.  This is because they avoid for-profits in great numbers.

What else do you notice?

If you want to limit the population, feel free to use any of the filters, but beware that the percentages of total are calculated on the base of the sub-population you've chosen, not the entire population.  You'll notice there are further differences by geography and campus urbanization, among other things. I'd love to hear what you turn up that's intriguing.
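
And if you want to see exactly what that caveat means, here's a minimal sketch with invented counts: the same institutions produce different percentages depending on whether the base is everything or just the filtered sub-population.

    import pandas as pd

    # Invented enrollment counts by sector and region; the real IPEDS universe
    # has almost 7,000 institutions.
    df = pd.DataFrame({
        "sector": ["Public", "Private not-for-profit", "Private for-profit"],
        "region": ["Midwest", "Midwest", "West"],
        "enrollment": [500_000, 120_000, 80_000],
    })

    # Unfiltered: percent of total uses all rows as the base.
    df["pct_of_all"] = df["enrollment"] / df["enrollment"].sum() * 100

    # Filtered to one region: the base shrinks to the sub-population, so the
    # percentages are no longer comparable to the unfiltered view.
    midwest = df[df["region"] == "Midwest"].copy()
    midwest["pct_of_midwest"] = midwest["enrollment"] / midwest["enrollment"].sum() * 100

    print(df.round(1))
    print(midwest.round(1))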