News Literacy

Localizing Events

1. Concussions and AP Exams

I always maintain an open, active ear in search of unique topics to document for tjTODAY. When I heard disgruntled complaints about insufficient AP exam accommodations, I approached one of the affected students, Rayyan Khan, to ask her about her experience. As a lacrosse athlete, Khan had sustained a concussion during practice in the middle of AP season. She felt she had difficulty obtaining enough extra time for her AP French Exam, a test whose difficulty was exacerbated by her concussion.


Concussed Complications with CollegeBoard AP Exams

Spotlight on sophomore Rayyan Khan’s experience applying for AP accommodations for a lacrosse-related concussion

by Christine Zhao

With AP season on the horizon, sophomore Rayyan Khan applied for CollegeBoard accommodations for the AP French and AP Computer Science exams. Khan sustained a lacrosse-related concussion in March, which has impeded her academic abilities.

Over 100 Advanced Placement (AP) Exam takers each year apply for testing accommodations from the CollegeBoard through Mr. Adam Wong in Student Services. Wong is the Assessment Specialist at Jefferson, coordinating the appropriate paperwork and forms for test takers to submit to the CollegeBoard for approval. The accommodations process is heavily computerized, basing the accommodations it grants on data from thousands of previous cases of a given illness, disorder, or condition.

“There are thousands of high schools across the US, and across the overseas countries; they also sponsor CollegeBoard exams,” Wong said. “Once they receive the information, they input it into their database over there in Princeton, and basically it’ll shoot out a comparison of the previous hundreds, thousands of previous applications, so everything’s pretty much computerized, so that’s pretty much how they determine if a student will receive 50% extra time, or in a rare case, 100% extra time.”

Requests for 50% extra time are common, Wong says, but only a fraction of those requests are approved.

“At least ⅔ of [requests] are asking for 50% extended time, and then out of that 75%, maybe only 60% of that 75% would actually get approved,” Wong said.

One of this year’s applicants for 50% extra time was sophomore Rayyan Khan, who was scheduled to take the AP Computer Science (APCS) and AP French Exams during the first two weeks of May. However, two months ago, things took an unexpected turn; on Mar. 6, Khan sustained a concussion at lacrosse practice, causing memory problems, light sensitivity, concentration issues, and noise-induced headaches over the course of several weeks.

“When I first went to the concussion clinic at the children’s hospital, the doctor there said that she would apply for 50% extra time, and so she requested that and had to fill out all these forms,” Khan said. “She said, based on your scores, you definitely qualify for accommodations, and I’m going to request for you to get 50% extra time. So she sent an email to the school, and Mr. Wong really helped out with that.”

Although Khan qualified to apply for accommodations, her request for 50% extra time was denied by the CollegeBoard and replaced with extra break time within exams.

“After they made their decision, they couldn’t really appeal it,” Khan said. “It took two weeks to get it processed, so they couldn’t really appeal [the decision] because this was something like four days before the APCS test, so we didn’t have another two weeks to get everything reprocessed.”

For her AP Computer Science exam, Khan felt prepared thanks to the school’s computer science curriculum, and extra breaks allowed her to destress before the second half of the exam. Khan was able to “focus on the second half a lot better than I would’ve been able to do otherwise,” finishing 40 minutes before the end of the exam. On the other hand, Khan believes that greater accommodations should have been approved for the AP French Exam, which requires greater use of recall, critical reading, and vocabulary.

“That 50% extra time on [the AP French Exam] would be probably what I need, so I’m kind of disappointed that didn’t get approved for me,” Khan said. “French, for me, is a harder test itself. I was probably going to get a 4 before my concussion, and now I’m not even sure if I’m going to get that without extra time just because reading everything and translating it takes a lot longer to do because I recall vocabulary a lot more slowly.”

While Khan understands the logistical complications of coordinating accommodations, she hopes the CollegeBoard will be more lenient and flexible with accommodations in the future to address a wider variety of situations.

“I understand that they can’t necessarily get the best perspective on a case by case basis because there’s probably so many people who are applying for these extra accommodations, and I’m lucky that I go to a school where I’m definitely prepared for the test in the first place,” Khan said. “I can understand their decision not to give me 50% extra time because that is a lot, but I wish they could be more lenient. Instead of just giving more breaks or 50% extra time, having those be the two options, they could have something more of a compromise or specialized to the individual.”


2. Harassment in the Workplace

I once heard an acquaintance complaining about how she constantly got hit on where she worked. When we chose “Entitlement” as our cover story, her comments came back to mind.

Sexual harassment wasn’t yet a topic of discussion at our school; it wouldn’t be until two months after I introduced it by publishing that article that claims started emerging on social media. I decided to ask her about her experience and interview some guys at our school to balance coverage, get the other point of view, and include insights about why sexual harassment can occur on a societal level.

Ensuring Accuracy

With harassment allegations, we had to be extremely careful not to overgeneralize or paint with a broad brush; we could only summarize and lead with exactly what the sources said and clarify statements through follow-up interviews and messages. As my vocabulary expanded through my journalism experience, I’ve been able to summarize sources’ quotes with vivid language that drew the reader in while shying away from bias or editorializing. One of my sources (name and image blocked out to ensure confidentiality) made a statement involving alleged assault. In my interview with the SR&O officer, he told us that he had not received any reports of assault, all of which should go through the police if reported. When she reviewed the context of her quotes, the source expressed hesitation toward labelling her case as assault.



Asking questions to clarify content; posted with their consent

We carefully reviewed her messages and our evidence and made sure to note that she had not reported it to the police, why she had not reported it, and what she was doing now that she realized her case might fall under assault. This ensured that we provided a truthful narrative of what happened: the school was not wrong in saying there were no reported cases of assault, only harassment, and it gave balance to a piece focused on how the school dealt with cases. I also used messages only to provide contextual information for a scenario (used for leading into quotes) and quoted only from in-person interviews, ensuring that we were receiving direct information for our article. As Ms. Harris puts it, the best modes of interviewing, in order, are “in-person, phone call, email, and last resort, Messenger.”

When one of the interviewees declined an in-person interview and wanted to talk to us online, our advisor cautioned us to email them and follow up in person to confirm the quotes. We learned that internet communication leaves open the possibility that we’re communicating with someone other than the person we thought we were interviewing. So, as a middle ground between messages and an in-person interview, I emailed our source questions, then met with her in person to confirm her answers. I decided not to include photos of our interactions because we chose different stories to highlight in the article.


Fact-Checking

Backstory

Mr. Shawn Frank, LIFT co-founder, talks to mentees at a Saturday session.

The new admissions tests have always felt personal to me. I’ve volunteered through my school’s LIFT program since sophomore year, when the program was first instituted, and have been leading the club as president since my junior year. Every week, I see the challenges our students face: they simply start off with less math content knowledge than students at schools feeding into Jefferson (Rachel Carson, Longfellow, and even my own middle school, Rocky Run). We do our best as an outreach program to advance their love for STEM and their chances of accessing the great opportunities here.

However, our main focus does tend to lean toward helping our mentees get into TJ, and the results are saddening: every time the Admissions Office changes its exam, the only thing we can continue to do is help them advance their math knowledge. When we knew what kinds of questions were on the test, we could pinpoint weaknesses and target them on an individual basis. Now, we have fewer resources than ever to help these kids get ahead.

The Article

As our mere 12 mentees were gearing up for the semifinalist round, I decided to pen my opinions in an editorial published in our print issue on Feb. 8. I went about the writing process scrupulously: with such a controversial topic, one many people support because the constant changes mean “people can’t train for the test,” I needed plenty of statistics to back me up.

I knew students from low-income backgrounds were not well-represented at TJ; in Government class, 25% of Jefferson survey respondents had a household income over $150,000. To back that claim up, I referenced the FCPS press release so many times Google doesn’t even keep count anymore. I looked at the numbers and the admit rates for the waived/reduced-fee subgroup. I also referenced the FCAG (Fairfax County Association for the Gifted) tables to investigate trends over the years. Surprisingly, representation has always been low: around 1-2% of the incoming class. Looking through the charts, I also noticed that the number of waived/reduced-fee applicants has been steadily decreasing. I had to look at the big picture, all the numbers from every available year, in order to support my claims.

Reduce, Recheck, Revise

On every revision, I made sure each implication was backed by personal experience, a fact, or a statistic. With this opinion article, I made sure to hit the Golden Trio (ethos, pathos, and logos) by anticipating objections, providing sobering memories and realities, and linking experience to analysis.

On the third revision, I took out a paragraph containing Jeremy Shughart’s quote about prioritizing diversity, along with my thoughts on it. I realized TJ Admissions might have lots of issues they’re trying to fix and simply the wrong idea of how to solve them. I might have been incorrect in assuming they hadn’t been focusing on diversity, so I cut those words. In this revision, I also went back and added source citations to increase the credibility of the piece.

On the fourth revision, I realized I’d incorrectly said that TJ Admissions did not offer applicants a character count for the writing portion. They did offer that, but not much else; I’d missed it in my research. I quickly corrected the error.

Document with comments/links to sources: ACT Aspire Opinion: Sources Link
Notes taken in research process:


The Children Left Behind

TJ Admissions process disadvantages disadvantaged students

by Christine Zhao

On Feb. 3, the lucky pool of TJ semi-finalists filed into local middle schools to sit for the writing portion of their exam: three student information sheets (SIS) and a math essay. With the newly-set benchmarks for math, reading, and science scores, only the highest scoring test-takers on the first-round exam proceeded to this semi-finalist round.

Right here, in these newly set benchmarks and the newly revamped admissions exam, is where the root of the problem lies. Almost every year since I applied to TJ, the Admissions Office has transformed its admissions process; though this purportedly prevents tutors from coaching students to a test, in actuality it only makes it harder for disadvantaged groups to prepare for the exam themselves.

The set of skills needed to excel on the exam is different from the one typically taught in schools: for those who haven’t been taking test prep for years, this may as well be the first timed, standardized test they’ve ever taken. For those who’ve prepared for years, frequent practice exams and sample prompts provide experience taking three-hour multiple-choice exams like the Quant-Q/ACT-Aspire. As a senior who went through more than three years of TJ prep classes, I find the contrast between tuition-based courses and free outreach programs saddening, because tuition-based programs simply have more time and money: the competitive culture that fuels TJ prep encourages students to take classes from elementary school on, while parents’ money purchases the best prep books and hires the best teachers.

Given the vague guidelines, little guidance, and a complete lack of prep material, TJ Admissions and its applicant site deny ordinary people, those without access to expensive courses, the opportunity to get ahead. The gap between applicant and finalist demographics continues to widen as fewer and fewer preparatory resources are made public.

The Root of the Problem

The current admissions process is unrecognizable but for the SIS and the teacher recommendations. In winter 2015, Admissions replaced the moral/ethical dilemma essay with one requiring applicants to calculate a math word problem. In winter 2017, Admissions replaced the former Pearson exam with one similar to the ACT, reflecting a change in testmakers.

Beyond the constant changes, the agency that makes the Quant-Q/ACT-Aspire doesn’t release public materials. Consequently, already disadvantaged students have little chance of pulling themselves up onto the same playing field as other applicants. Nowhere on the website is there a list of expectations, a rubric, or any document an applicant could use to prepare for the exam. In fall 2013, we were at least given a practice test in a document detailing the types of questions to expect.

That too is gone. In its place is a one-sentence line on the Admissions site stating the character limit for each portion. No time limit is stated for the writing portions, nor is there any description of what the SIS or the math essay even is. For the ACT-Aspire, TJ Admissions gives only a time limit and an abstract list of concepts to know. It’s as if the Admissions Office assumes that every visitor has known the admissions process from birth. There is absolutely nothing on the website that helps you prepare for any part of the exam. With more weight placed on this test than ever before, this situation inherently harms applicants and families who are new to the admissions process.

“Money doesn’t grow on trees”

At first glance, the changes on paper might seem insignificant. Isn’t the Admissions Office simply revamping the exam to increase the quality of their accepted applicants? Changing the test a little won’t matter to those who’re really qualified to get in: smart people are smart no matter what test they take.

That popular perspective stems from an ignorance of what conditions are like for people who haven’t been test-prepping for years. Income has already been correlated with test scores on two popular standardized tests, the SAT and ACT. The CollegeBoard’s “Total Group Profile Report” in 2013 and 2016 showed that each income bracket increase represented an average score increase of 10-30 points per section on the SAT (the CollegeBoard doesn’t release income data for the new SAT). In 2016, ACT scores were approximately 4 points higher for test-takers whose family income exceeded $80,000 a year. And according to Emeritus UCLA professor W. James Popham, “one of the chief reasons that children’s socioeconomic status is so highly correlated with standardized test scores is that many items on standardized achievement tests really focus on assessing knowledge and/or skills learned outside of school—knowledge and/or skills more likely to be learned in some socioeconomic settings than in others.”

With the admissions requirements shrouded in obscurity, low-income students stand at an even greater disadvantage than they do on the SAT. Families with more money can afford to give their children that extra edge by signing them up for whatever prep classes they can find. They can pay tutoring organizations to teach their children test-taking skills, “skills learned outside of school,” and to access a cache of previous and example prompts, as I witnessed when I took TJ prep. Even when prompts are rendered outdated by test changes, access to old prompts gives private-tutoring pupils an edge over others: they become accustomed to the format of the writing sections and gain an approximate idea of what to expect.

Branching Off: Who It Affects

I’ve sat down with LIFT students’ parents and witnessed their difficulties, even in the initial stages of the application process. They’re confused about which of the similar-looking buttons will actually direct them to the site they want. The application site is difficult to navigate, featuring multiple sub-pages with few pieces of useful information. For a family unfamiliar with TJ prep culture, it is nearly impossible to go into this test prepared.

Unfortunately, this is the reality for the children of many first-generation immigrants. With each almost-annual change, even people who try to help these disadvantaged groups are rendered helpless; every year, we, the mentors and teachers in the LIFT Program (an outreach program for underrepresented groups), have less of a sense of what’s on the test and how to anticipate its questions. Last year, 37 LIFT mentees made it to the semi-finalist round. This year, 12 did. This trend in our own LIFT Program has coincided with a saddening trend in the applicant pool: though the number of TJ students eligible for free/reduced lunch rose in the 2000s, since the class of 2016 (with a spike for the class of 2020, the first class for which LIFT was implemented), even the number of reduced/waived-fee applicants has steadily declined. Minus the LIFT mentees, who all have application fees waived, the Admissions Office could barely boast 200 reduced/waived-fee applicants from across all five counties and cities.

The numbers speak for themselves. For the class of 2020, 10 out of 333 reduced/waived fee applicants were accepted, a 3% acceptance rate. For the class of 2021, 8 out of 289 were accepted, a 2.7% acceptance rate. Their representation in the incoming class has hovered around 1-2% for the past decade. In comparison, the overall acceptance rate last year was 16.9%.

The implications of TJ admissions statistics were discussed in a controversial Washingtonian article, “Does the No. 1 High School in America Practice Discrimination?”, published Apr. 26, 2017. Though I hardly think that the Admissions Office is actively discriminating against specific races or socioeconomic groups, the lack of exam transparency and any support materials inherently discriminates against low-income applicants.

The Leaves: A New Beginning

TJ Admissions needs a wake-up call: other testing agencies’ models have proved to work. Through its partnership with Khan Academy, the CollegeBoard sought to “confront one of the greatest inequities around college entrance exams, namely the culture and practice of high-priced test preparation,” it said in a statement on March 5, 2014, two years before the new SAT launched. Hundreds of practice questions are available for public use, and all eight practice tests from the Official SAT Study Guide are published on Khan Academy, giving those without means the same access to practice materials. This has allowed both students and the CollegeBoard to prosper: six hours on Khan Academy increases scores by an average of 90 points, and the number of SAT takers has risen by 15 percent. The CollegeBoard rebranded its SAT into a standardized test more approachable than ever, while every single time our admissions exam changes, it moves backwards.

The Admissions Office needs to give more notice of any exam changes and provide free, comprehensive preparation materials for the redesigned exam, just as the CollegeBoard did. If the test makers still refuse to release materials publicly, then a compromise can and must be made: find another testing agency, give the public the reasoning behind and details of the eventual change, and work for two to three years to make the CollegeBoard standard a reality. Even New York’s SHSAT (Specialized High School Admissions Test), long found to resemble the Pearson admissions exam, provided a comprehensive 15-page document when it instituted a new exam in fall 2017 (with translations into multiple languages for those not fluent in English). This document fully detailed the changes and answered frequently asked questions. The NYC Department of Education then followed up with a 160-page student handbook containing general directions, guidelines, and sample problems and tests. The effort doesn’t need to be as large-scale as the Khan Academy/CollegeBoard partnership, but the Admissions Office needs to offer a web platform somewhere in between one sample practice test and Khan Academy’s individualized problem sets.

Right now it is offering nothing. And that needs to be fixed.

The movement begins with you. Change occurs when ordinary people speak up against the status quo. Take a few minutes to pen an email to the Admissions Office (tjadmissions@fcps.edu); bring up your own points or mention some of mine. We, the students, staff, and alumni, demand more transparency, with clear-cut guidelines and a well-developed system of preparation. Our power comes from empathizing with the shrinking opportunities for future applicants, applicants bright and driven but underprepared for the system of exams, even when the outcome bears no immediate effect on ourselves.