Thread: College/Uni Exit Test?

  1. #1

    College/Uni Exit Test?

    Smart idea or the next thing to be gamed? As much as I agree too many schools are not providing the best post-secondary education, I don't think centralized standardized testing is going to make people better employees. Anyone wanna convince me otherwise? I'm open to it.

    Colleges Set to Offer Exit Tests
    Employers Say They Don't Trust Grade-Point Averages

    Next spring, seniors at about 200 U.S. colleges will take a new test that could prove more important to their future than final exams: an SAT-like assessment that aims to cut through grade-point averages and judge students' real value to employers.

    The test, called the Collegiate Learning Assessment, "provides an objective, benchmarked report card for critical thinking skills," said David Pate, dean of the School of Arts and Sciences at St. John Fisher College, a small liberal-arts school near Rochester, N.Y. "The students will be able to use it to go out and market themselves."

    The test is part of a movement to find new ways to assess the skills of graduates. Employers say grades can be misleading and that they have grown skeptical of college credentials.

    "For too long, colleges and universities have said to the American public, to students and their parents, 'Trust us, we're professional. If we say that you're learning and we give you a diploma it means you're prepared,' " said Michael Poliakoff, vice president of policy for the American Council of Trustees and Alumni. "But that's not true."

    The new voluntary test, which the nonprofit behind it calls CLA+, represents the latest threat to the fraying monopoly that traditional four-year colleges have enjoyed in defining what it means to be well educated.

    Even as students spend more on tuition—and take on increasing debt to pay for it—they are earning diplomas whose value is harder to calculate. Studies show that grade-point averages, or GPAs, have been rising steadily for decades, but employers feel many new graduates aren't prepared for the workforce.

    Meanwhile, more students are taking inexpensive classes such as Massive Open Online Courses, or MOOCs, but have no way to earn a meaningful academic credential from them.

    HNTB Corp., a national architectural firm with 3,600 employees, sees value in new tools such as the CLA+, said Michael Sweeney, a senior vice president. Even students with top grades from good schools may not "be able to write well or make an argument," he said. "I think at some point everybody has been fooled by good grades or a good resume."

    The new test "has the potential to be a very powerful tool for employers," said Ronald Gidwitz, a board member of the Council for Aid to Education, the group behind the test, and a retired chief executive of Helene Curtis, a Chicago-based hair-care company that was bought by Unilever in 1996.

    Only one in four employers think that two- and four-year colleges are doing a good job preparing students for the global economy, according to a 2010 survey conducted for the Association of American Colleges and Universities.

    Meanwhile, GPAs have been on the rise. A 2012 study looking at the grades of 1.5 million students from 200 four-year U.S. colleges and universities found that the percentage of A's given by teachers nearly tripled between 1940 and 2008. A college diploma is now more a mark "of social class than an indicator of academic accomplishment," said Stuart Rojstaczer, a former Duke University geophysics professor and co-author of the study.

    Employers such as General Mills Inc. and Procter & Gamble Co. long have used their own job-applicant assessments. At some companies such as Google Inc., GPAs carry less weight than they once did because they have been shown to have little correlation with job success, said a Google spokeswoman.

    At Teach for America, which recruits college students to teach in rural and urban school districts, the GPA is just one of dozens of factors used to winnow nearly 60,000 applicants for 5,900 positions. Candidates who make it to the second step of the process are given an in-house exam that assesses higher-order thinking, said Sean Waldheim, vice president of admissions at the group. "We've found that our own problem-solving activities work best to measure the skills we're looking for," he said.

    The Council for Aid to Education, the CLA + test's creator, is a New York-based nonprofit that once was part of Rand Corp. The 90-minute exam is based on a test that has been used by 700 schools to grade themselves and improve how well their students are learning.

    The CLA+ will be open to anyone—whether they are graduating from a four-year university or have taken just a series of MOOCs—and students will be allowed to show their scores to prospective employers. The test costs $35, but most schools are picking up the fee. Among schools that will use the CLA+ are the University of Texas system, Flagler College in Florida and Marshall University in West Virginia.

    The CLA+ is scored on the 1600-point scale once used by the SAT "because everyone is familiar with that," said Chris Jackson, director of partner development at the Council for Aid to Education. Instead of measuring subject-area knowledge, it assesses things like critical thinking, analytical reasoning, document literacy, writing and communication.

    Cory LaDuke, a 21-year-old senior at St. John Fisher, said he had mixed feelings about taking the CLA+ but understood why employers might be skeptical of some graduates, because "some people don't work that hard and fake their way through it."

    "It kind of sucks that an employer can't trust your GPA, but that's the way it is right now, so this also an opportunity," said Mr. LaDuke. "It's another way to prove yourself."

    Other groups also have been seeking ways to better judge graduates' skills. The Lumina Foundation, which aims to boost the number of college graduates, is offering a way to standardize what students should know once they earn a degree. The MacArthur Foundation has helped fund a system of "badges" for online learning to show mastery of certain skills. Last Thursday, President Barack Obama said he wants the federal government to devise a ratings system to gauge colleges' performance based on student outcomes.

    Meanwhile, established testing companies are introducing new tools. Earlier this year, Educational Testing Service, which developed the Graduate Record Exam, announced two certificates to reward high marks on its Proficiency Profile, which assesses critical thinking, reading, writing and math.

    And ACT, the nonprofit that administers the college-admission exam of the same name, has a National Career Readiness Certificate, which measures skills such as synthesizing and applying information presented graphically.

    Educational Testing Service was surprised to learn through a survey last spring that more than a quarter of businesses were using the GRE to evaluate job applicants, said David Payne, an ETS vice president.

    Sean Keegan, a 2011 graduate of Tufts University, has posted his GRE score on his resume because he landed in the 97th percentile, even though he isn't applying to graduate school. "I think it shows I'm relatively smart," said Mr. Keegan, who is looking for work in finance. "So far, I've gotten a lot of positive feedback from employers."

    http://online.wsj.com/article/SB1000...959843818.html

  2. #2
    If this becomes popular, how long before colleges start teaching to these tests?

    I think it makes sense for companies to have individualized exams that test for the traits they think are valuable in their environment (a bunch of consulting firms already do this). I can't see how all firms would want the same skills though.
    Hope is the denial of reality

  3. #3
    Colleges and universities did this to themselves. Institutions that used to provide a well-balanced education meant to improve the lives of those who attended, or professional degrees (teaching, law, medicine, etc.), shifted over to workforce development (selling the traditional liberal arts degree as workplace skills) in order to increase enrollments. Now they have to justify the shift (and the tax dollars and student loan debt).

    I think we'll see colleges and universities break off into segmented types of education. You'll have the Ivy League schools ignoring the test, public colleges and universities adopting it across the board, and private middle-of-the-road colleges taking the biggest hit, splitting between those that need the increased enrollments and those that want to be Ivy League schools.

    The best solution for business is individualized assessment systems that evaluate an applicant's skills, personality, and traits to best match their needs. One standardized test given at college isn't going to accomplish that.
    Get off my lawn
    I can live without #16 and #17

  4. #4
    *shrugs* Engineers have had exit tests for a very long time. Many engineers leaving accredited schools take the Fundamentals of Engineering (FE) exam, followed up after several years of work experience by taking another exam for licensure as a Professional Engineer (PE). Europe has a somewhat similar system. This tends to only work well with traditional engineering disciplines (civil/mechanical/electrical/etc.) but is taken by quite a large number of engineers after they graduate. It helps your resume quite a bit.

    I think some concerns can be laid to rest, here. The FE exam is far more subject-specific than any pan-university test is likely to be, and I don't really see schools specifically teaching to it. In fact, while we were made aware of it in undergrad, no effort was made on the part of the school to prepare for the exam. Some vague 'critical thinking skills' test isn't likely to be easy to teach to in most courses and curricula, so they'll probably ignore it and focus on making their education decent. The real focus - for engineering schools - is on acquiring and keeping certification from ABET, the main organization which evaluates undergraduate programs in engineering. That has a clear influence on curricula, and I think it's largely benign or even beneficial. If similar industry-advised groups arise for other disciplines, they would probably have a much stronger effect on improving the utility of a college education. Certainly some places are trying to make better industry-academic tie-ups, especially for 2-year colleges - expanding to a formal accreditation program is not an unreasonable step.

    I recognize that engineers are a different breed of undergraduate than other disciplines - far more skills-oriented and subject-specific than most other majors. So I would question the utility of everyone taking the same test out of uni, for the simple reason that it may be (a) irrelevant to employers for the job in question and (b) irrelevant to the student's course of study. As Loki already mentioned, I know of many employers who administer their own testing to interviewees and use that to correlate with GPA (though they don't rely on this too much - my wife, for example, with a decent GPA from a very good program, performed rather poorly on some British IQ test because of language issues, but was hired for her first job on the basis of stellar interview performance, and she made her employers very happy with their choice). So perhaps job-specific or subject-specific testing makes more sense. Certainly if they're essentially making this test vaguely like a GRE (which the article seems to imply), it's going to have a lot of clustering at high scores, making it hard to differentiate top applicants - the GRE was absurdly easy.

    I doubt this will be very useful for 'fuzzy skills' that most university graduates are supposed to have. But I doubt it will hurt that much, either.

  5. #5
    What is this supposed to be testing? People don't expect, nor are they interested in, a guy who earns a B.S. in physics to have the same skills or knowledge as someone with a B.S. in psychology. Same goes for the Linguistics guy and the History guy. Tests by professional associations and accreditation groups are one thing, but a standardized test for all college graduates is ridiculous.
    Last night as I lay in bed, looking up at the stars, I thought, “Where the hell is my ceiling?"

  6. #6
    Kind of like the GRE.
    Hope is the denial of reality

  7. #7
    Quote Originally Posted by LittleFuzzy View Post
    What is this supposed to be testing? People don't expect, nor are they interested in, a guy who earns a B.S. in physics to have the same skills or knowledge as someone with a B.S. in psychology. Same goes for the Linguistics guy and the History guy. Tests by professional associations and accreditation groups are one thing, but a standardized test for all college graduates is ridiculous.
    I think it's an extension of the idea that a college degree (to a business) is worth less than it used to be. Many entry-level jobs requiring a BA/BS only require it because it shows, at minimum, an ability to commit; the actual job knowledge is taught on the job.

    Sorta like how almost all college programs use Calc 2 to weed out students.
    "In a field where an overlooked bug could cost millions, you want people who will speak their minds, even if they’re sometimes obnoxious about it."

  8. #8
    Based on public university experience... a test is needed. You shouldn't be able to pass a class without actually understanding the material but that happened all the freaking time. I'd say only about 1 in 5 classes actually had professors who required mastery of the material taught in order to pass. If that.

  9. #9
    Eh, I don't think grade inflation is very relevant here. After all, if you think colleges give people the right skills but are too lenient on the grading, you can just hire anyone with a sufficiently high GPA (getting an A is still hard in most classes).
    Hope is the denial of reality

  10. #10
    Quote Originally Posted by Loki View Post
    Eh, I don't think grade inflation is very relevant here. After all, if you think colleges give people the right skills but are too lenient on the grading, you can just hire anyone with a sufficiently high GPA (getting an A is still hard in most classes).
    Outside of your entry point into work, most places don't care about GPA after a couple of years of work history. Strangely enough, though, having that degree makes a huge difference even if you have 5-10 years of work history. I know I had a leg up getting a recent promotion because I had a degree and some others didn't - so going to college was a good career move, but from a business perspective it seems silly.

  11. #11
    I think it's a "covering one's ass" kind of thing. If you promote someone without a degree and they're terrible, you as the boss are going to get roasted. If you hire someone with a degree and they suck, you can reasonably claim that you couldn't see it coming.
    Hope is the denial of reality

  12. #12
    Quote Originally Posted by wiggin View Post
    *shrugs* Engineers have had exit tests for a very long time. Many engineers leaving accredited schools take the Fundamentals of Engineering (FE) exam, followed up after several years of work experience by taking another exam for licensure as a Professional Engineer (PE). Europe has a somewhat similar system. This tends to only work well with traditional engineering disciplines (civil/mechanical/electrical/etc.) but is taken by quite a large number of engineers after they graduate. It helps your resume quite a bit.

    I think some concerns can be laid to rest, here. The FE exam is far more subject-specific than any pan-university test is likely to be, and I don't really see schools specifically teaching to it. In fact, while we were made aware of it in undergrad, no effort was made on the part of the school to prepare for the exam. Some vague 'critical thinking skills' test isn't likely to be easy to teach to in most courses and curricula, so they'll probably ignore it and focus on making their education decent. The real focus - for engineering schools - is on acquiring and keeping certification from ABET, the main organization which evaluates undergraduate programs in engineering. That has a clear influence on curricula, and I think it's largely benign or even beneficial. If similar industry-advised groups arise for other disciplines, they would probably have a much stronger effect on improving the utility of a college education. Certainly some places are trying to make better industry-academic tie-ups, especially for 2-year colleges - expanding to a formal accreditation program is not an unreasonable step.

    I recognize that engineers are a different breed of undergraduate than other disciplines - far more skills-oriented and subject-specific than most other majors. So I would question the utility of everyone taking the same test out of uni, for the simple reason that it may be (a) irrelevant to employers for the job in question and (b) irrelevant to the student's course of study. As Loki already mentioned, I know of many employers who administer their own testing to interviewees and use that to correlate with GPA (though they don't rely on this too much - my wife, for example, with a decent GPA from a very good program, performed rather poorly on some British IQ test because of language issues, but was hired for her first job on the basis of stellar interview performance, and she made her employers very happy with their choice). So perhaps job-specific or subject-specific testing makes more sense. Certainly if they're essentially making this test vaguely like a GRE (which the article seems to imply), it's going to have a lot of clustering at high scores, making it hard to differentiate top applicants - the GRE was absurdly easy.

    I doubt this will be very useful for 'fuzzy skills' that most university graduates are supposed to have. But I doubt it will hurt that much, either.
    Good points. My concern boils down to this: many of the employers supposedly behind this are looking for non-technical skills, which inherently can't be tested for. It's an attempt to quantify and standardize skills that are highly qualitative.

    I think it's fine for employers to have these kinds of tests. I've taken them before and usually not done too well on them. But, with one exception, the employers who hired me are all the richer for having a rigorous qualitative evaluation process.

  13. #13
    Trump University?

  14. #14
    The new voluntary test, which the nonprofit behind it calls CLA+, represents the latest threat to the fraying monopoly that traditional four-year colleges have enjoyed in defining what it means to be well educated.
    And it only costs $35. It's an idea with great potential, but it shouldn't be limited to college/university exit tests. The test should be open to everyone, regardless of age, even if they're HS dropouts, have a GED, or have "only" taken MOOCs. There are many ways to become well-educated without a four-year degree. Being a military veteran is just one example.

  15. #15
    Trump University isn't an actual university, fraud issues aside. It's a place that only offers real estate seminars.

    Military personnel have access to educational facilities on many bases...
    Hope is the denial of reality

  16. #16
    I'm not opposed to people taking a test to show they know their stuff without a degree, but there's still a lot of difference between passing one test and completing a full college education.

    Over here, you need your high school diploma to enroll in most colleges, but if you're over 21 and don't have the right qualification you can have an intake interview and may be admitted anyway.

  17. #17
    The point of an exit exam is to test the core competencies (critical thinking skills, analysis, etc.) which are supposed to be taught in traditional liberal arts courses. Most undergraduate programs consist of three parts: 1/2 Major (60 credits), 1/4 Core (30 credits), 1/4 electives (30 credits). Some majors have more or fewer credits, and some cores have more requirements. Professional degrees usually have professional exams (CPA, engineering, teaching, etc.) which test the knowledge gained from your major. There isn't anything to test the knowledge gained from your Core/Electives. The English course you were required to take in college was supposed to teach critical thinking skills, analysis, etc., which tend to be important skills for employment (even if the books you read are irrelevant to your job... Ethan Frome anyone?). If those skills are lacking, it doesn't matter that you have a 4.0 GPA; you'll probably be a crappy employee. Or so the reasoning goes.

    The CLA+ and other such tests are not meant to test your "major" knowledge (business courses, physics, biology, etc.), but rather the skills learned in the Core courses (philosophy, English, social sciences, etc.). Someone who never attended college can, in theory, score really well on a CLA+-type exam, but they may still lack the knowledge gained from the college "Major." It will simply be another "thing" to put on your resume, but I doubt we'll see it replace the college degree in importance. And I don't think it will level the playing field for degree versus non-degree.

    The question then becomes, if you have a stack of resumes on your table, what's the order you sort by?
    1) College Degree, 2) CLA+, 3) Professional License/Certification, 4) MOOC courses, 5) Experience?
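
    To make the question concrete, here's a rough sketch in Python (every field name, score, and weighting here is made up, just to illustrate sorting a resume pile by that priority order):

        # Hypothetical candidate records; the fields mirror the list above.
        candidates = [
            {"name": "A", "degree": True,  "cla_plus": 1280, "license": False, "moocs": 2, "years_exp": 1},
            {"name": "B", "degree": False, "cla_plus": 1450, "license": True,  "moocs": 6, "years_exp": 4},
            {"name": "C", "degree": True,  "cla_plus": 1100, "license": True,  "moocs": 0, "years_exp": 7},
        ]

        # Sort best-first: degree, then CLA+ score, then license, then MOOCs, then experience.
        ranked = sorted(
            candidates,
            key=lambda c: (c["degree"], c["cla_plus"], c["license"], c["moocs"], c["years_exp"]),
            reverse=True,
        )

        print([c["name"] for c in ranked])  # ['A', 'C', 'B'] with this particular ordering

    Of course, the real argument in this thread is whether that priority order is even the right one.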
    Get off my lawn
    I can live without #16 and #17

  18. #18
    If you don't gain those critical thinking and analysis skills early on in your college life, how do you think you'd be able to do well in more advanced classes? Granted, it's hard to fail people nowadays, but there's no way most people in my department would give higher than a C+ to anyone with poor writing or analysis skills, content aside.

    Incidentally, different people are good at different tasks. Some people are just naturally better at test-taking. That's likely the main trait that is tested by one-off exams. Do you think the ability to do well on such activities is necessary for good employees in most jobs?

    In my current course, exams count for slightly under half of the final grade, because I've been in enough classrooms to see that the people doing the best on exams aren't necessarily the people with the best understanding of the material (they do tend to be the people who are best at taking exams!).
    Hope is the denial of reality
