Educational Attainment, Achievement, Credentials, and Skills Data
This discussion forum focuses on educational attainment, achievement, credentials, and skills data.
Ideas, questions or requests for information may be posed to other members of this Community of Practice, one or more of whom may be able to help you or respond with their own ideas.
User Comments (27)
The Lumina Foundation today released “A Stronger Nation through Higher Education,” a lengthy study which includes educational attainment data for individuals aged 25-64 for each state, county, and the 100 largest metropolitan areas.
The data come from the U.S. Census Bureau’s American Community Survey and draw on 2010, 2008-2010, or 2006-2010 data: for progressively smaller geographic areas, the study combines data for more years. Except at the state level (which shows seven levels of educational attainment), the study provides no breakdowns beyond whether individuals possess at least an Associate’s degree.
After a nationwide overview, this 130-page study has a chapter for each state.
The full study can be accessed at A Stronger Nation through Higher Education.
A private researcher has mined preliminary federal data from the 2010-11 academic year to provide a much timelier picture than the U.S. National Center for Education Statistics (NCES) does of how many Americans are earning postsecondary credentials.
The data are available at http://chartsdemo.cloonware.com/plots/a/demo/degrees/, a site set up as a customized search tool that allows the user to select the type of degree (from Associate’s degree through Doctorate/Professional degree); national or state data; and whether the degrees emanate from public, private for-profit, or private not-for-profit institutions.
The background to this new tool is well explained in this news article: Researcher Creates New Tool. The researcher has previously argued that — since NCES typically makes relatively modest changes between the preliminary and the final data — the U.S. Education Department should issue the preliminary data (Timelier College Completion Data).
The U.S. Bureau of Economic Analysis follows this model in its three issuances of GDP data for a given quarter.
Urban Institute researchers have estimated job growth by educational attainment through 2017 for the nation, 25 states, and 22 metropolitan areas. Both industry and occupational employment growth estimates are provided. The authors have also analyzed unemployment rates by various demographic characteristics, as well as by citizenship, by whether an individual’s primary language is not English, and by whether the individual has a physical or mental condition that limits their work.
Their estimated net employment growth from 2012 to 2017 does not differ greatly by educational attainment level, varying only from a low of 9.6 percent to a high of 11.5 percent (p. 14). However, the authors emphasize that this estimate should be placed in the context of significantly higher job loss rates during the past five years among those with less educational attainment. For example, they forecast that individuals with advanced degrees will regain their 2007 employment level during this year, those with Bachelor’s degrees by 2013, but those with high school diplomas or less not until 2014 (p. 15).
Data for 25 states are shown in Appendix pp. 10-13, and those for 22 metropolitan areas are shown in the link below. Data for different demographic groups and by work barriers (such as language or disability) are shown on pp. 19-20.
This 36-page report (not including the metropolitan-specific reports) is available at A National Picture of Short-term Employment Growth by Skill.
The full citation is Pamela Loprest and Josh Mitchell, Labor Market and Demographic Analysis: A National Picture of Short-term Employment Growth by Skill (Washington: Urban Institute, May 2012).
Readers should bear in mind several caveats about the report. First, the authors use educational attainment levels from 2007 as a proxy for skills (the last pre-recession year is used in an attempt to control for the tendency of recessions to cause workers with higher educational attainment to take lower-skilled jobs). No use is made of the new BLS education and training classification system. Employment projections are based upon data from Moody’s Analytics, not the BLS projections (which assume that a full-employment economy will exist in the projected year).
On July 26, 2012, the U.S. Education Department issued its first data under the new Gainful Employment regulations, showing that five percent of postsecondary education programs — all located at for-profit colleges — did not meet any of three key requirements. As explained below, two of the requirements take into account the post-program earnings of enrollees. Under the Gainful Employment regulations, career training programs will continue to qualify for federal student aid if they meet one of the following three metrics in at least three out of four consecutive years.
• Loan Repayment Rate: At least 35 percent of the program's former students are repaying their loans;
• Debt-to-Earnings Annual Ratio: The estimated annual loan payment of a typical graduate does not exceed 12 percent of his or her total earnings;
• Debt-to-Discretionary-Earnings Ratio: The estimated annual loan payment of a typical graduate does not exceed 30 percent of his or her discretionary income. [After the Department's announcement, this criterion was blocked by a court ruling.]
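Under these rules, a program keeps eligibility if it clears any one of the three thresholds. A minimal sketch of that pass/fail logic in Python (the thresholds are the ones quoted above; the function name and sample program figures are hypothetical, and the actual regulatory calculations, such as how discretionary income is derived relative to the poverty line, are considerably more involved):

```python
def passes_gainful_employment(repayment_rate, annual_loan_payment,
                              annual_earnings, discretionary_income):
    """True if the program meets at least one of the three metrics."""
    meets_repayment = repayment_rate >= 0.35                                  # 35% of former students repaying
    meets_earnings = annual_loan_payment <= 0.12 * annual_earnings            # payment within 12% of earnings
    meets_discretionary = annual_loan_payment <= 0.30 * discretionary_income  # within 30% of discretionary income
    return meets_repayment or meets_earnings or meets_discretionary

# Hypothetical program: 28% repayment rate, $3,000 typical annual loan payment,
# $22,000 annual earnings, $8,000 discretionary income.
print(passes_gainful_employment(0.28, 3000, 22000, 8000))  # → False (fails all three)
```

Note that the "three out of four consecutive years" condition would apply on top of this single-year check.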
The Department conducted a Webinar shortly after issuing the news release, and indicated that it would later post the Webinar online.
For more information, see http://www.ed.gov/news/press-releases/five-percent-career-training-programs-risk-losing-access-federal-funds-35-percen
The Inside Higher Ed newsletter posted an article on the announcement, at http://www.insidehighered.com/news/2012/06/26/education-department-releases-data-gainful-employment-rule
The U.S. Education Dept.’s National Center for Education Statistics has issued “Trends Among Young Adults Over Three Decades, 1974-2006,” based upon data from four separate longitudinal surveys. The study tracked, two years later, the experiences of those who had been high school seniors in their Spring semester.
Over the three decades, the proportion of seniors enrolled in postsecondary courses two years later rose from 40 to 62 percent, while the proportion working at that time correspondingly fell, from 48 to 28 percent. Most of this change had occurred by the 1990’s, as relatively little change occurred between the last two cohorts of seniors examined. Nonetheless, in every cohort more than 90 percent of the seniors had worked at some point in the intervening two years. Moreover, the proportion who enrolled in school and simultaneously worked rose over the period, from 63 to 78 percent. Thus, these youth showed a strong attachment to the labor market despite rising enrollment in school.
Note that the study excluded those who dropped out of high school before the Spring of their senior year (see below for a corresponding study of dropouts).
The links for the new study and two previously-released companion longitudinal studies of dropouts and high school seniors in their senior year are shown below.
1. Trends Among Young Adults Over Three Decades, 1974-2006
2. Late High School Dropouts: Characteristics, Experiences, and Changes Across Cohorts
3. Trends Among High School Seniors, 1972-2004
BLS has released the results from its latest re-survey of individuals first interviewed in 1979 when they were between 14 and 22 years old. At the time they were re-interviewed for these new data (2010-11), they were between 45 and 53.
Individuals held an average of 11.3 jobs between age 18 and 46; nearly half of these jobs occurred before age 25. Investment in education has a large and enduring impact. Those with less than a high school education were employed only 60 percent of the time by age 46, compared with 82 percent of those with a Bachelor’s degree or higher.
Even in their 40’s, the most educated individuals continued to garner increases in inflation-adjusted earnings (1.6 percent annually), while those with less than high school experienced only 0.2 percent annual growth.
Comparing the largest racial and ethnic groups, Whites were employed the greatest percentage of the time (79.9 percent of the time between age 18 and 46), followed by Hispanics (72.5 percent) and African Americans (68.7 percent).
The BLS news release, “Number of Jobs, Labor Market Experience, and Earnings Growth Among the Youngest Baby Boomers: Results from a Longitudinal Survey,” is available at Number of Jobs, Labor Market Experience, and Earnings Growth Among the Youngest Baby Boomers
The U.S. National Center for Education Statistics has issued its preliminary data on Fall 2011 postsecondary staffing patterns and 2010-11 attendance costs (both before and after aid). This brief report (2 tables), with national statistics only, includes broad occupational categories for both instructional and non-instructional staff at 4-year, 2-year, and less-than-2-year schools, broken out by whether they were public, private non-profit, or private for-profit.
The attendance cost data are presented for the same types of schools, and also indicate the cost for families at various income levels as well as the average grant or scholarship aid that students received. Among full-time, first-time degree/certificate-seeking undergraduate students receiving any grant aid: at public 4-year institutions, the average price before aid was approximately $17,600 and the net price about $11,000; at nonprofit 4-year institutions, roughly $34,000 before aid and about $19,800 net; and at for-profit 4-year institutions, approximately $27,900 before aid and about $22,500 net.
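Since net price is simply the price before aid minus average grant and scholarship aid, the figures above imply the average aid amounts per sector. An illustrative back-of-the-envelope calculation (variable names are mine; the dollar figures are the approximate averages quoted above):

```python
# Approximate averages quoted above for full-time, first-time undergraduates
# receiving any grant aid (2010-11).
price_before_aid = {"public 4-year": 17_600, "nonprofit 4-year": 34_000, "for-profit 4-year": 27_900}
net_price        = {"public 4-year": 11_000, "nonprofit 4-year": 19_800, "for-profit 4-year": 22_500}

for sector in price_before_aid:
    implied_aid = price_before_aid[sector] - net_price[sector]
    print(f"{sector}: implied average grant/scholarship aid ≈ ${implied_aid:,}")
```

By this arithmetic, implied average aid is largest at the nonprofit 4-year schools (about $14,200) despite their higher sticker price.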
"Employees in Postsecondary Institutions, Fall 2011 and Student Financial Aid, Academic Year 2010–11 - First Look (Preliminary Data)" is available at Employees in Postsecondary Institutions, Fall 2011 and Student Financial Aid, Academic Year 2010–11
The National Research Council has issued a summary of a January 2012 workshop, Key National Education Indicators (Washington: National Academies Press, 2012).
The 107-page report (which can be downloaded for free by registering on the site) is available at
Key National Education Indicators
Employment-related indicators are discussed in Chapters 4 (Indicators for Higher Education) and 5 (Indicators for Adult Postsecondary Education and Training). The report includes a review of which indicators would be desirable, and the challenges and problems in collecting these data.
The U.S. National Center for Education Statistics (NCES) has released a Congressionally-mandated report analyzing problems in postsecondary educational attainment by gender, race, and Hispanic origin. This 300-page study makes use of a broad variety of governmental and private-sector surveys and, in addition to the demographic groups above, examines Whites, African Americans, Asian Americans, American Indians/Alaska Natives, and Native Hawaiians and other Pacific Islanders.
Chapter 7 covers employment and earnings outcomes, including a section on the employment of individuals possessing science, technology, engineering, and math (STEM) degrees. For outcome results, the study principally focuses on individuals who are 25 to 34 years old.
The report analyzes numerous factors that might play a role in educational attainment problems, including poverty and parental background; school characteristics (including counseling); student activities and behavioral problems; coursework; postsecondary expectations, access to information, and financial issues; and the experiences and problems of individuals once enrolled in postsecondary education.
Access the study at Higher Education: Gaps in Access and Persistence Study (Washington: U.S. National Center for Education Statistics, August 2012), NCES 2012-046.
The U.S. National Center for Education Statistics has released “Writing 2011,” its assessment of the written proficiencies of 8th and 12th grade students. This report is part of the widely respected National Assessment of Educational Progress (NAEP) series.
Since national writing tests are rarely performed for the adult population, this assessment of high school seniors provides the closest approximation of adult skills in this area. NAEP conducted previous writing assessments in 1998, 2002 and 2007, but unfortunately the newly-designed 2011 assessment cannot be compared to these earlier reports. The sample included just over 28,000 12th graders. NCES only reports national results from this survey.
Scoring for the assessment is based on a scale ranging from 0 to 300, with the average by definition set at 150. The assessment ranked students at three levels: basic, proficient, and advanced. Among seniors, the proportions scoring at or above each of the three levels were 79, 27, and 3 percent, respectively.
Data are presented for the usual demographic categories, as well as for Asian Americans and American Indians/Alaska Natives. The report also examines the results by parental educational attainment; city, suburb, town or rural school; amount of writing homework; and computer usage.
See the U.S. Institute of Education Sciences, National Center for Education Statistics, Writing 2011: National Assessment of Educational Progress at Grades 8 and 12 (Washington: NCES, September 2012), NCES 2012-470.
The U.S. Bureau of Labor Statistics (BLS) has just released “A comparison of college attendance and high school coursework from two cohorts of youth,” BLS Special Studies & Research, Volume 1, Number 14, October 2012. Comparing two cohorts of high school students, from the late 1970s/early 1980s vs. the late 1990s/early 2000s, the younger students took much more demanding coursework in math, science, and foreign languages. This, combined with individuals’ rising propensity to pursue postsecondary education, suggests a workforce that is better skilled than in the past.
The author, using high school transcript data (available for about 70 percent of the sample), chose these subject areas because the progression of coursework difficulty is more easily measured than for other courses. Math and science were each divided into five levels of difficulty, while foreign language was divided into three levels — in all cases the lowest level represented no coursework taken in the subject.
The historical differences were dramatic (pp. 2-4). The proportion taking math at the upper two levels (the highest level being calculus) rose from 10 to 35 percent between the older and younger cohorts (p. 2). Similarly, the proportion that took the highest level science courses (chemistry and physics) rose from 3 to 27 percent. The share who took two or more years of a foreign language rose from 30 percent to slightly more than half.
Note that transcript comparison data are also available from the U.S. Department of Education’s various longitudinal studies. For the relevant links, see Links to Longitudinal Surveys.
Drop in Remedial Courses in the 2000s
The U.S. National Center for Education Statistics (NCES) has issued the latest in its series of surveys (for the 2007-8 school year) on the proportion of first-year postsecondary undergraduates who report enrolling in remedial (also known as developmental) courses. The percentage of first-year students who enrolled in remedial courses was a little less than 30 percent in both the 1995 and 2000 surveys, a proportion which fell to the 19-20 percent range in both the 2003-4 and the latest survey (there was generally a very slight increase between the last two surveys).
However, the implications of this decline are not at all clear, as the survey does not attempt to measure the need for remediation (nor whether the courses were completed), and various factors could explain the findings. Therefore, the survey sheds no direct light on the skill level of postsecondary students.
Comparing the 2000 data with the latest results, there was a general decline in remedial course-taking across the racial groups, by Hispanic origin, by gender and age, and at both 2- and 4-year public institutions. The decline was especially pronounced among schools with either open-admission or minimally selective enrollment policies. Only among private schools was there less than consistent evidence of declining remediation.
For the full report, see U.S. NCES Statistics in Brief, “First-Year Undergraduate Remedial Coursetaking: 1999–2000, 2003–04, 2007–08,” January 2013 (NCES 2013-013), 12 pp.
Employment and Education by Age 25
The U.S. Bureau of Labor Statistics (BLS) has issued the latest data from its National Longitudinal Survey of Youth 1997 (NLSY97), showing dramatic disparities by age 25 in the employment histories of the most and least educated young adults. Almost all (93 percent) of those who had completed college were employed during the October when they were age 25, compared with only 60 percent of those who hadn’t finished high school. Although unemployment rates were more than twice as high for the latter group, the greatest difference was in the likelihood of not being in the labor force at all: 33 vs. 5 percent for high school dropouts vs. college graduates, respectively. In fact, by their 26th birthday, 5 percent of youths who had not received a high school diploma had never held a job since age 18.
Despite their greater availability for work because they hadn’t attended postsecondary school, high school dropouts only worked about half the total time (54 percent) between ages 18 and 25, compared with nearly three-quarters (73 percent) of the time for those with at least a Bachelor’s degree.
By age 25, 26 percent of total young adults had obtained a Bachelor’s degree or more, and another 14 percent were still in college. Just over half (51 percent) had ceased their education with a high school diploma (8 percent via a GED), and the remaining 9 percent hadn’t completed high school. Nearly a third (30 percent) of Whites had received a Bachelor’s degree or more by age 25, compared with only 14 percent of African Americans and 12 percent of Hispanics. Correspondingly, African Americans and Hispanics were about twice as likely as Whites to not complete high school by their mid-twenties.
By age 25, individuals had held an average of 6.3 jobs.
See America’s Youth at 25.
NLSY97 is a nationally representative survey of about 9,000 men and women who were born during the years 1980 to 1984. These respondents were ages 12 to 17 when first interviewed in 1997, and ages 25 to 31 when interviewed for the 14th time in 2010-11. The survey provides information on work and nonwork experiences, training, schooling, income, assets, and other characteristics. For further information, see the NLSY97 Web site.
12th Graders’ Economics Knowledge
The U.S. National Center for Education Statistics (NCES) released its assessment of economics knowledge among 12th graders for 2012, and compared the results with its first assessment of this subject in 2006. Overall, more than half of students scored at a less-than-proficient level. The average 2012 score was not significantly different from that of 2006, but lower-scoring students and Hispanic students did achieve small but statistically significant gains over this period.
The assessment covered three economics areas: the market economy (i.e., microeconomics) plus the national and international economies. Nearly 11,000 students participated from both public and private high schools. Part of the National Assessment of Educational Progress (NAEP), such tests of individuals on the threshold of adulthood are important because topic-specific assessments of adult knowledge are rare — most of our knowledge on this subject derives from the approximately once per decade general-purpose literacy and numeracy assessments. The economics assessment was not designed to provide estimates at the state or local levels.
Lower-performing students significantly raised their scores between 2006 and 2012: both those scoring at the 10th percentile and those scoring at the “below basic” level (scores were placed into one of four categories: below basic, basic, proficient, and advanced). None of the racial/ethnic groups significantly increased their scores except Hispanics (data are included for American Indians and Asian Americans, in addition to the races more commonly covered).
The largest score gaps (in decreasing order) were between English-as-a-second-language (ESL) learners vs. other students (52 point differential); students with vs. without disabilities (34 points); White/Asian American vs. African American students (29 points); students whose parents graduated from college vs. those who didn't finish high school (27 points); private vs. public school students (16 points); and boys vs. girls (6 points).
See Economics 2012 for the full report.
Should You Go To College?
The Brookings Institution has issued a useful briefing paper summarizing recent research on the value of a Bachelor’s degree, concluding that although on average higher education unquestionably pays off, the return on investment for postsecondary education varies widely and is in some cases negative. The educational major, occupation, and cost and selectivity of the school matter greatly.
See Should Everyone Go to College?
The inflation-adjusted earnings gap between Bachelor’s degree holders and those with only a high school diploma widens until the mid-40s, then remains roughly stable until the early 50s before narrowing. However, the Bachelor’s degree earnings advantage remains large throughout individuals’ careers. One of the most important reasons for this pattern is that the inflation-adjusted earnings of those with only high school diplomas grow just through their early 30s and then level off for the next two decades (while the earnings of their better-educated contemporaries continue to rise) before falling.
Factoring in the cost of a Bachelor’s degree, the return on investment is positive for every type of school, but not for every individual school; in fact, almost 200 postsecondary institutions showed negative returns for their graduates. Sorting schools by their selectivity in admissions, public schools outperformed private, not-for-profit ones.
The best-paid college major was engineering, followed by computers and math. The lowest paid major (with barely half the lifetime earnings of engineering graduates) was education, followed by the arts and psychology. The highest-earning occupational category was architecture and engineering, followed by computers, math, and management. The lowest-earning occupation for college graduates was in service — according to Census Bureau calculations, the lifetime earnings of an education or arts major working in the service sector were actually lower than the average lifetime earnings of a high school graduate.
The Brookings study is not based on original research, and only examines the differential between those with a high school diploma and those with a Bachelor’s degree. It doesn’t examine what is known about high school dropouts, Associate’s degree holders, certificate holders, those who attended but didn’t complete college, or possessors of advanced degrees.
College Graduates’ Occupational Mismatches
A New York Federal Reserve Bank study shows a surprisingly weak correspondence between both the Bachelor’s degree itself and the field of study, on the one hand, and the occupation in which individuals are later employed, on the other. Only 62 percent of Bachelor’s degree holders worked in occupations that required a Bachelor’s degree, and a mere 27 percent worked in occupations directly related to their college major.
The study examined the U.S. Census Bureau’s American Community Survey data from 2010, making use of recently added questions on the field of degree (the Bureau places degree holders into 1 of 171 detailed degree fields). Note that the study didn't examine all degree holders, but was restricted to 16-64 year-olds with a Bachelor’s degree (those with higher degrees were excluded) who lived in metropolitan areas (as the study focused on the impact of urban areas). However, since the vast majority of workers live in metropolitan areas, this shouldn’t affect the overall conclusions. Also, the data include those who were unemployed, although given the low unemployment rates among college graduates this would have only slightly increased the degree-to-occupation mismatch.
To ascertain whether a Bachelor’s degree was necessary for each occupation, the authors utilized the U.S. Employment and Training Administration’s Occupational Information Network (O*NET) survey, using as a threshold whether 50 percent or more of the respondents working in that occupation indicated that at least a Bachelor’s degree was necessary to perform the job (p. 6).
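That threshold rule is straightforward to express. A hedged sketch, assuming one already has, for each occupation, the share of O*NET respondents reporting that at least a Bachelor’s degree is needed (the occupations and shares below are invented for illustration; the real O*NET data cover hundreds of occupations):

```python
def requires_bachelors(share_reporting_ba_needed):
    """Apply the authors' threshold: at least 50% of respondents in the occupation."""
    return share_reporting_ba_needed >= 0.50

# Invented illustrative shares, keyed by occupation name.
onet_shares = {"accountant": 0.91, "retail supervisor": 0.22, "web developer": 0.55}
ba_occupations = [occ for occ, share in onet_shares.items() if requires_bachelors(share)]
print(ba_occupations)  # → ['accountant', 'web developer']
```

A simple majority cutoff like this is easy to apply at scale, though it is obviously sensitive to occupations whose respondents split close to 50/50.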
See Agglomeration and Job Matching among College Graduates (see Figure 1 in the appendix for the mismatch data).
4 Decades of Reading & Math Scores
The U.S. National Center for Education Statistics has issued its latest analysis of reading and math test results from the early 1970s to 2012, for the National Assessment of Educational Progress (NAEP, sometimes popularly referred to as “The Nation’s Report Card”). Unfortunately, for age 17 — the highest age examined — there has been no clear trend toward improvement over the past four decades in these two subjects.
See Trends in Academic Progress: Reading 1971-2012, Mathematics 1973-2012.
Since consistent national tests are only sporadically conducted for adults, NAEP tests of 17-year-olds and high school seniors are one of the best sources on the skills that adults are likely to possess. Note that NAEP’s grade-level tests (of grades 4, 8 and 12) are completely separate from its trend analyses of ages 9, 13 and 17. We focus here on the results for 17 year-olds, as gains among younger individuals have little lasting impact if they are not sustained for 17 year-olds.
The overall trend in reading scores for 17 year-olds (as is true of many NAEP reading subgroup trends) is progress from 1971 through 1988 (most of which occurred in the 1980s), a decline by 2004 that undid all of the previous gains, followed by scores rising again after 2004 (a statistically significant rise from 283 to 287 in 2012, on a 500-point scale). Math scores, however, show a different pattern, with declines in the 1970s followed by gains through 1992 and then little change. But as is true for reading, math scores show no clear trend of improvement over the four-decade span (p. 1).
There has been progress in narrowing the test score gap between Whites and minorities, and between boys and girls — in all cases because the 17 year-olds with lower scores (Blacks, Hispanics and females) improved their performance. Similarly, lower-performing 17 year-olds (those scoring at the 10th and 25th percentiles) also narrowed the test score gap with their higher-performing classmates by raising their scores over time (pp. 2, 12).
The level of abilities and skills shown in these assessments is cause for concern. NAEP divides its scores (which use a 500-point scale) into five levels, only three of which are used for 17 year-olds. For reading in 2012, 18 percent of 17 year-olds did not attain the lowest level (250), 61 percent did not reach the second-highest level, and 94 percent did not reach the top level. The top level involves reading complex texts, such as scientific and historical documents, understanding the links between ideas in the text (even those implicitly conveyed), and drawing appropriate generalizations. The second-highest level entails the ability to find, understand, summarize, and explain relatively complicated information. Trends for the second and third levels since 1981 reflect the general pattern noted above, while there has been little change at the most demanding level (pp. 14-15).
The proportion of 17 year-olds who reported that they read for enjoyment nearly every day has fallen significantly since comparable data began in 1984. From a roughly 30 percent level in 1984-94 (fluctuating between 27 and 31 percent), the proportion dropped to 19 percent by 2012 (p. 27). Perhaps not coincidentally, most of the decline occurred between 1994 and 1996, the same period in which the Internet became popular.
NAEP uses the same 500-point scale in math (although the scales aren’t comparable), also divided into three levels for 17 year-olds. To give three examples: 17 year-olds who can convert decimals into percents would likely score at the 260 level; those who can calculate a percentage are likely to score at the 325 level; and those who can interpret data from tables, charts, and graphs are likely to score at the 360 level (p. 35). Except at the top level (350), performance was relatively better in math than in reading, but the tested math abilities are nevertheless a cause for concern. In 2012 all but 4 percent attained the lowest level (a score of 250), 40 percent did not perform at the second-highest level (300), and 93 percent did not attain the top level (350). Since 1978 (the earliest year reported), there has been little change in performance at the top level. At the second-highest level, scores dropped by 1982, then rose through 1999 (most of the rise had occurred by 1992), with little change between 1992 and 2012. At the lowest level, performance improved between 1978 and 1986, but changed little thereafter (p. 37).
Since 1978, students have been taking more demanding math courses, with the proportion taking pre-calculus or calculus approximately quadrupling from the 4-6 percent range in 1978-82 to 23 percent in 2012, and the proportion taking second-year algebra or trigonometry rising from 37 to 54 percent between 1978 and 2012. Except for a few intervals, the proportion taking more difficult math courses has continued to rise over more than three decades (p. 49). However, despite this more advanced coursework, math test scores over the period have not measurably improved for either students scoring at the 90th percentile or (as noted above) those scoring at the top level (350).
The appendices include additional data, including on students with disabilities who had been excluded from most NAEP trend assessments until 2004 (pp. 4, 57).
For an excellent overview of the NAEP long-term tests, including an analysis of their strengths and weaknesses, which remains valuable despite its 2009 publication date, see NAEP Long-Term Trend Assessment. For a more detailed examination of trends in algebra (since 1978), see the Brookings Institution’s The Algebra Imperative: Assessing Algebra in a National and International Context.
College Degree Needed?
A new Gallup poll found that two-fifths of workers with only a Bachelor’s degree don’t believe that their job requires a Bachelor’s or more advanced degree, and 37 percent of those earning $75,000 or more annually concurred. Even 33 percent of executive/managerial/professional respondents agreed. Conversely, 17 percent of those who had never been to college believed that their jobs required a Bachelor’s degree or more education.
See Majority of U.S. Workers Say Job Doesn't Require a Degree.
Gallup posed the following question to just over 1,000 working adults in August: “Does the type of work you do generally require a bachelor’s degree from a college or university or some other advanced academic degree?” The same question was asked in 2005 and 2002, with overall responses the same in 2005 and 2013. Gallup acknowledges that respondents may have been confused as to whether the question referred to the skills necessary to perform the work, or their employer’s educational attainment requirements for the job.
The survey had a margin of error of plus or minus 4 percentage points.
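As a rough check on that figure, the margin of error for a simple random sample can be approximated with the standard formula z × sqrt(p(1−p)/n). The sketch below is illustrative only: it assumes n = 1,000 (Gallup reports "just over 1,000" respondents) and the most conservative proportion p = 0.5, and it yields about ±3.1 percentage points, so Gallup’s reported ±4 points presumably also reflects design effects from sampling and weighting.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error (as a proportion) for a simple random
    sample of size n with observed proportion p."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst-case (p = 0.5) margin for a sample of 1,000 respondents,
# expressed in percentage points.
moe_points = 100 * margin_of_error(1000)
print(round(moe_points, 1))  # about 3.1 points
```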
For a related recent study, see College Grads’ Job Mismatches.
Longitudinal Study of 9th to 11th Graders
The U.S. National Center for Education Statistics (NCES) has issued the first results from its follow-up of how 2009's 9th graders fared 2.5 years later, when most of the students were in the Spring term of 11th grade. The results include important findings on dropping out, progress by socioeconomic background, math scores, and students’ preparation and expectations for college and work.
See High School Longitudinal Study of 2009: First Follow-up (2012).
Only 2.7 percent of those beginning 9th grade in 2009 had dropped out by close to the end of 11th grade; another 1.7 percent were still in school, but had not yet reached the 11th grade because they’d repeated a grade. Interestingly, the combined proportion who had either been promoted to 12th grade (3.4 percent) or had already graduated (1.1 percent) almost exactly matched the proportion who had dropped out or repeated a grade (4.5 vs. 4.4 percent, respectively). [p. 7]
Progress by Socioeconomic Background
The study repeatedly underscored the importance of socioeconomic background in how well students progressed. NCES defined socioeconomic status using an index based on parental education and occupation, and family income. Teens were placed in one of five categories (quintiles) from the lowest to the highest scores. [p. A-9] For example, 5 percent of those with the most disadvantaged socioeconomic backgrounds had dropped out by 2012, vs. only about half a percent of those with the most advantaged backgrounds. Similarly, 6.3 percent of those whose parents hadn’t finished high school had dropped out by 2012, vs. less than 1 percent of those with at least one parent who possessed a Master’s degree or more. [p. 7]
Nearly a third (31 percent) of those whose parents hadn’t finished high school expected to attain no more than a high school education, vs. 6 percent of those with at least one parent who possessed a Master’s degree or more. Conversely, 37 percent of those whose parents hadn’t finished high school expected to obtain a Bachelor’s degree or higher, vs. 79 percent of those with at least one parent who possessed a Master’s degree or more. [p. 5]
Tested Math Achievement
Achievement on a math assessment was an even better predictor of progress by 2012. [pp. 6-7, 13-14] The study included state data on math performance for public school students in 10 populous states (California, Florida, Georgia, Michigan, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, and Washington). There was surprisingly little variation in scores by state: on a scale of 0 to 118, scores varied only from 37 to 40 in 2009, and from 61 to 66 in 2012. At the national level, average math achievement rose over these 2.5 years from 38 to 64 points. [p. 15]
Preparation for Future Schooling and Work
Although 80 percent of these teens had done some research about college, only 63 percent had yet spoken with a high school counselor about their post-high-school options. Nearly half (48 percent) had attended a career day or job fair. One-third (34 percent) had volunteered in a job related to their career goals, and 17 percent had taken part in an internship or apprenticeship related to their career goals. [p. 12]
Survey Background and Other Resources
NCES re-surveyed these individuals in 2013, and expects to do so again in 2016 (three years after graduation), with additional follow-ups planned until respondents reach at least age 30. The current report reflects only a small fraction of the information gathered. In addition to survey questions, students were given a mathematics assessment in algebraic reasoning and problem-solving in both 2009 and 2012. [p. 2]
NCES' High School Longitudinal Study of 2009 is the latest in a long history of U.S. Education Dept. surveys tracking the progress of a cohort of junior high or high school students. For research and data on the experiences of previous cohorts (including U.S. Labor Dept. surveys of teens), see Links to Longitudinal Surveys.
Limited Spoken English Proficiency, by State
The American Institutes for Research (AIR) has produced a set of 2-page profiles of the adult education and limited-spoken-English-proficiency population, for both the U.S. and the states. The profiles include adult education spending and enrollment data plus the population without a high school credential, but the most distinctive data analyze spoken English proficiency by labor force status (employed, unemployed, and not in the labor force; see the symbols at the top of p. 2), gender, age, race, and Hispanic origin. The only source of current information on spoken English proficiency is the U.S. Census Bureau’s American Community Survey (ACS).
AIR has produced these profiles for the U.S. Education Department. The national profile is available at Profile of the Adult Education Target Population, and the data for each state are available at State Profiles of the Adult Education Target Population.
The ACS data can be accessed directly for current and past years — at the national, state, and local levels — at the Census Bureau’s American FactFinder.
Mismatch Between College Majors and Jobs
Replicating the findings of other recent research, the polling firm Harris Interactive (in a survey done for CareerBuilder) found that 32 percent of college-educated workers reported that they had never held a job related to their college major. Restricting the findings to those aged 35 and older barely changed the results (31 percent), although the degree-job match does improve with experience: nearly half (47 percent) of college-educated workers said their first job after college was not related to their college major. Some 36 percent of all college-educated workers wished they had chosen a different major.
The survey was conducted online from August 13 to September 6, 2013, and included a representative sample of 2,134 workers across industries and company sizes who graduated from college — but was restricted to full-time workers outside of government employment.
See One-Third of College-Educated Workers Do Not Work in Occupations Related to Their College Major.
For recent research with similar findings, see College Degree Needed? and College Graduates’ Occupational Mismatches.
Secondary Career Coursetaking Declines
The U.S. National Center for Education Statistics (NCES) has released a 2-page brief showing a roughly half-year drop in the amount of career/technical education (CTE) taken by public high school graduates between 1990 and 2009. All of the decline from 4.2 to 3.6 credits (one credit equals a full year’s coursework) occurred since 2000. This drop is in marked contrast to the significant increase in academic coursetaking over the same period.
Although occupational coursework dipped slightly, most of the drop was in non-occupational areas. Typing classes have all but disappeared, falling from .5 to .1 credits, and family and consumer education dropped from .5 to .3 credits. Only career preparation courses rose, from .4 to .6 credits over this nearly two-decade span.
The slight fall in occupation-specific coursetaking was primarily driven by a halving of business courses (.8 to .4 credits). Computer/information science classes remained unchanged at .2 credits, but communications and design courses doubled from .2 to .4 credits. Health science courses rose from almost zero to .2 credits by 2000, but have since remained at that level.
For the 2-page summary, see Trends in CTE Coursetaking, while the detailed data are available at CTE Statistics Table H125.
It should be noted that, for unknown reasons, the reported transcript results differ between this source and other published sources, including the NCES’ NAEP Data Explorer for the High School Transcript Study. For a review of coursetaking patterns that extends back to 1982 and includes interim years not covered by this new CTE report, see ETA’s High School Coursework Over 3 Decades.
12th Grade Math and Reading Scores Unchanged from 2009 to 2013
The U.S. National Center for Education Statistics has issued its latest analysis of reading and math test results for 12th graders for 2013, from the National Assessment of Educational Progress (NAEP, sometimes popularly referred to as “The Nation’s Report Card”). Comparing 2013 with the previous 2009 tests, overall average scores were identical in each subject. A separate NAEP test of 17 year-olds (most of whom were high school juniors) similarly showed steady overall average scores between the two most recent test years (2008 and 2012). For seniors, the tests have been comparable in reading since 1992, but only since 2005 in math. NAEP scales scores differently between the two subjects, with reading using a 500-point scale and math using a 300-point scale. For public school students only, NAEP also reported state-level test scores for 13 states (AR, CT, FL, ID, IL, IA, MA, MI, NH, NJ, SD, TN and WV).
Demographic and Disability Trends. The high school population continues to show a rapid rise in the proportion of Hispanic 12th graders (from 7 percent in 1992 to 14 percent in 2005 to 20 percent in 2013), with a corresponding fall in White high school seniors from 74 to 58 percent from 1992 to 2013 (with little change among African American, Asian American, and Native American seniors). Students with disabilities rose slightly from 7 percent to 9 percent of 12th graders between 2005 and 2013 (the earliest and latest available years). Interestingly, despite the growth in the share of Hispanic seniors, the proportion of English-language learners fell from 4 to 3 percent between 2005 and 2013. All of the foregoing changes were statistically significant (hereafter denoted as “significant”).
Trends Common to Math and Reading Scores. Not only average scores but also scores across the percentile range (i.e., among both high and low scorers) were virtually unchanged between 2009 and 2013.
Reading Scores. For the 8 comparable reading tests given between 1992 and 2013, average scores have fluctuated with no clear pattern, falling by 4 points over the entire span (292 to 288), with 1992 being the peak and 2005’s 286 representing the lowest score (the 2013 scores are significantly higher than those in 2005). The biggest score gaps in 2013 were between English-language learners and those who were not (237 vs. 290, respectively); students with and without disabilities (252 vs. 292, respectively); those with parents who didn’t finish high school and those who graduated from college (270 vs. 299, respectively); and African Americans and Whites (268 vs. 297, respectively). The smallest differences were between city and suburban seniors (285 vs. 291, respectively), and between boys and girls (284 vs. 293, respectively). Among demographic and other groups, there were only three changes of more than 3 points for seniors between 2009 and 2013, and none of these were significant: declines among Native Americans (from 283 to 277) and English-language learners (240 to 237), and an increase from 286 to 289 among rural seniors.
Math Scores. The national average score in 2013 was 153, identical with that of 2009 but significantly higher than the 2005's score of 150 (the earliest comparable year). The biggest score gaps in 2013 were between English-language learners and those who were not (109 vs. 155, respectively); African Americans and Asian Americans (132 vs. 172, respectively); students with and without disabilities (119 vs. 157, respectively); and those with parents who didn’t finish high school and those who graduated from college (137 vs. 164, respectively). The smallest differences were between boys and girls (155 vs. 152, respectively — the reverse of the pattern for reading scores), and city and suburban seniors (149 vs. 158, respectively). Among demographic and other groups for whom data are available since 2005, the most steady trend has been the increase in the scores of Hispanics from 133 to 138 to 141 in 2005, 2009 and 2013, respectively (although the change between the latter two years was not significant). In contrast, scores among English-language learners fell from 120 to 117 to 109 in 2005, 2009 and 2013, respectively (although the change between the latter two years was not significant).
State Highlights. For public school seniors only, 2013 reading scores ranged in the 13 available states from WV’s 280 (little changed from 2009) to NH’s 295 (up 2 points from 2009), with the overall public school average at 287. In math, scores ranged from 145 in TN (no 2009 scores available) and WV (up 4 points from 2009) to 161 in NH (little changed from 2009) and MA (down 2 points from 2009), with the overall public school average at 152.
Sources. See the 2013 Mathematics and Reading: Grade 12 Assessments Home page; for detailed and customized data, see the links at the bottom of the screen. The most detailed one-stop locations for data, including state-specific data, can be found at these links for reading and math. For an overview of the trends in reading and math for 17 year-olds, which are available since the early 1970s, see Four Decades of Reading and Math Scores.
College Still Pays Off Abundantly
Researchers at the Federal Reserve Bank of San Francisco have produced an excellent, brief and readable analysis that concludes that “the value of college is high and not declining over time.” For most students, “tuition costs… can be recouped by age 40, after which college graduates continue to earn a return on their investment in the form of higher lifetime wages.”
Other researchers have come to the same conclusions, but the advantage of this new study is that it is based on the University of Michigan’s Panel Study of Income Dynamics — the nation’s longest-running longitudinal study, which began in 1968. To avoid the confounding effects of a post-graduate education, the study compares individuals with a Bachelor’s degree only against those with a high school diploma only.
By this comparison, a Bachelor’s degree has continued to reap a rich reward for more than 40 years. Even at the lowest point (1980), college graduates earned 43 percent more than their high school-educated counterparts. Over the entire four-decade plus period, the average payoff was 57 percent more per year (more than $20,000 annually).
The authors also examined three cohorts of college graduates separately (1950s-60s, 1970s-80s, and 1990s-2000s), to ascertain generational experiences. The first two cohorts’ experiences were remarkably similar, but the most recent cohort fared even better than the previous two once they had been in the labor market for more than 5 years. The college advantage also rises over one’s lifetime, a finding that replicates previous research.
See ”Is It Still Worth Going to College?” The study includes a link to a “tuition calculator” that will allow individuals to calculate the number of years after which their own earnings can be expected to exceed their college investment (see p. 4).
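The breakeven logic behind the “recouped by age 40” finding can be sketched with a simple calculation: divide the total up-front cost of a degree (tuition plus earnings foregone while enrolled) by the annual earnings premium. The inputs below are purely hypothetical assumptions for illustration, except for the roughly $20,000 average annual premium the study reports; the study’s own tuition calculator accounts for factors this sketch ignores, such as discounting and wage growth.

```python
def years_to_recoup(tuition: float, foregone_earnings: float,
                    annual_premium: float) -> float:
    """Years of the college earnings premium needed to offset the
    total up-front cost of the degree (no discounting applied)."""
    return (tuition + foregone_earnings) / annual_premium

# Hypothetical inputs: $40,000 in total tuition, $120,000 in earnings
# foregone over four years of study, and the roughly $20,000 average
# annual premium cited in the study.
years = years_to_recoup(40_000, 120_000, 20_000)
print(years)  # 8.0 years after graduation, well before age 40
```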
A related commentary from the Brookings Institution examines work by two private sector organizations that attempt to measure the payoff to attendance at individual postsecondary institutions. The author concludes that the Federal government should take the lead to link administrative data on education and earnings. See How Well Does College Pay?