Impact of Computing — AP Computer Science Principles (CSP) Study Guide
For: Students preparing for the AP Computer Science Principles exam.
Covers: Beneficial vs harmful effects of computing innovations, the digital divide, environmental impacts of computing systems, and bias in algorithmic decision-making, aligned with the AP CS Principles CED.
You should already know: No prior CS required.
A note on the practice questions: All worked questions in the "Practice Questions" section below are original problems written by us in the AP CS Principles style for educational use. They are not reproductions of past College Board papers and may differ in wording, numerical values, or context. Use them to practice the technique; cross-check with official College Board scoring guidelines for grading conventions.
1. What Is Impact of Computing (CSP)?
Impact of Computing is the study of how computing innovations, systems, and tools shape individual experiences, societies, economies, cultures, and the natural world, for better and for worse. It is a core unit of the AP CSP syllabus (Big Idea 5 in the College Board CED), making up 21-26% of your end-of-course exam score, and the same evaluation skills feed into the Create Performance Task, which is worth 30% of your total grade. Common synonyms for this topic include "societal impact of technology" and "computing ethics". Examiners will regularly ask you to evaluate tradeoffs of real-world computing innovations, identify systemic harms, and propose practical mitigation strategies.
2. Beneficial vs harmful effects
Every computing innovation has both intended effects (the explicit purpose it was designed to serve) and unintended effects (unforeseen positive or negative outcomes that emerge after widespread use). No innovation is entirely beneficial or entirely harmful; you will almost always be asked to evaluate both sides for full marks on exam questions. Intended beneficial effects are usually straightforward: for example, a food delivery app is designed to reduce wait times for restaurant meals and create flexible income opportunities for delivery workers. Unintended beneficial effects may include reduced drunk driving incidents, as users can order food instead of driving to restaurants after drinking, or increased revenue for small local restaurants that cannot afford their own delivery infrastructure. Unintended harmful effects of the same app may include increased traffic congestion in dense urban areas, wage theft for delivery workers who are classified as independent contractors, and increased plastic waste from single-use delivery packaging.
Worked Example
A public school district adopts a free AI tutoring app for all middle school students, designed to give personalized math practice and instant feedback on homework. List 1 intended beneficial effect, 1 unintended beneficial effect, and 1 unintended harmful effect of this innovation.
Solution
- Intended beneficial effect: Reduces teacher workload for grading routine math assignments, and gives students immediate feedback to fix knowledge gaps instead of waiting days for graded work.
- Unintended beneficial effect: Students with social anxiety who are too shy to ask for help in class can access extra practice and support anonymously, reducing achievement gaps for neurodivergent students.
- Unintended harmful effect: The app collects data on student performance and learning habits that is sold to third-party advertising companies, putting student privacy at risk.
3. Digital divide
The digital divide refers to the systemic gap between demographic groups or geographic regions that have reliable, affordable access to high-speed internet, modern computing devices, and digital literacy training, and those that do not. It is not just a gap in device ownership: it also includes access to accessible technology for disabled users, age-related gaps in digital skills, and gender-based barriers to tech access in many regions. Common root causes of the digital divide include:
- Socioeconomic status: In the U.S., low-income households are roughly one-third as likely as high-income households to have high-speed home internet.
- Geography: 25% of rural U.S. households lack access to broadband infrastructure, compared to 2% of urban households.
- Disability: Only 40% of public-facing websites meet accessibility standards for users with visual or hearing impairments.
- Age: 30% of adults over 65 lack the basic digital literacy skills needed to use common online services.

Impacts of the digital divide include limited access to remote education, telemedicine services, online job applications, government benefits, and civic participation tools like online voter registration.
Worked Example
A state government announces that all applications for unemployment benefits will be moved to an online-only portal, with no phone or in-person application options available. Explain how this policy exacerbates the digital divide for state residents.
Solution
Residents without home internet, residents of rural areas without broadband access, seniors without digital literacy skills, and disabled users who cannot access the non-accessible portal will be unable to apply for unemployment benefits they are eligible for. This disproportionately harms marginalized groups, widening existing economic and social inequalities in the state. Valid interventions to reduce this harm include setting up free application support centers in public libraries, adding a phone application option, and updating the portal to meet accessibility standards.
4. Computing and the environment
Computing systems have both negative and positive impacts on the natural environment, and exam questions will often ask you to evaluate these tradeoffs for full marks.
Negative impacts
- E-waste: Discarded electronics (phones, laptops, servers, batteries) contain toxic materials including lead, mercury, and cadmium that leach into soil and groundwater when dumped in unregulated landfills. Only about 17% of global e-waste is formally collected and recycled, per the UN's Global E-waste Monitor.
- Greenhouse gas emissions: Data centers running cloud services, AI model training, and crypto mining consume massive amounts of electricity, often from fossil fuel sources. By some estimates, training a single large language model can emit as much carbon as hundreds of passenger cars driving for a year.
Positive impacts
- Smart grid technology: AI-powered electricity grids adjust energy distribution based on real-time demand, reducing overall energy waste by up to 15% per U.S. Department of Energy data.
- Precision agriculture: Sensors and AI tools help farmers reduce water use by 30% and fertilizer use by 25%, reducing agricultural runoff and water pollution.
- Renewable energy optimization: Computing systems optimize the storage and distribution of solar and wind power, making renewable energy sources more reliable and cost-competitive with fossil fuels.
Worked Example
A small business replaces its on-premise data servers with cloud hosting from a provider that runs all its data centers on 100% solar energy. The on-premise servers used 8,000 kWh of electricity per year, from a local grid that emits 0.4 kg of CO₂ per kWh. The equivalent cloud service uses 5,000 kWh of electricity per year from zero-emission solar sources. Calculate the annual carbon emission reduction from this change, showing your work.
Solution
First calculate annual emissions from the on-premise servers: 8,000 kWh × 0.4 kg CO₂/kWh = 3,200 kg of CO₂ per year. The cloud service emits 0 kg of CO₂ from electricity use. Total annual reduction: 3,200 kg − 0 kg = 3,200 kg of CO₂ per year, equivalent to taking roughly 0.7 gas-powered cars off the road for a year.
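The arithmetic above generalizes to any before/after comparison. Here is a minimal Python sketch; the helper name and the 4,600 kg CO₂ per car per year equivalence factor are our own illustrative assumptions (the latter based on typical U.S. EPA passenger-vehicle figures):

```python
def annual_emission_reduction(old_kwh, old_kg_per_kwh, new_kwh, new_kg_per_kwh=0.0):
    """Annual CO2 reduction (kg) from switching electricity sources."""
    old_emissions = old_kwh * old_kg_per_kwh  # kg CO2/year before the switch
    new_emissions = new_kwh * new_kg_per_kwh  # kg CO2/year after (0 for solar)
    return old_emissions - new_emissions

# Worked example: 8,000 kWh/yr at 0.4 kg/kWh -> 5,000 kWh/yr of zero-emission solar.
reduction = annual_emission_reduction(8_000, 0.4, 5_000)
print(reduction)                     # 3200.0 kg CO2 saved per year
print(round(reduction / 4_600, 1))   # ~0.7 car-years (assumed 4,600 kg CO2/car/year)
```

On the exam you must show the multiplication and subtraction by hand, but checking your answer this way catches arithmetic slips.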
5. Bias in algorithms
Algorithm bias refers to a systematic pattern of unfair or discriminatory outputs from an automated decision-making system for specific groups of users, as opposed to random one-off errors. Bias almost always originates from human choices, not inherent "flaws" in the technology itself: common root causes include biased training data, lack of diverse testing across user groups, and historical systemic bias embedded in input datasets. Common real-world examples of algorithm bias include:
- Facial recognition systems with error rates up to 30x higher for Black women than for white men, which have contributed to wrongful arrests.
- Hiring algorithms that penalize resumes that include references to women's organizations (e.g. "Women in Engineering Club"), leading to 40% fewer interviews for female candidates.
- Loan approval algorithms that penalize applicants living in majority-Black neighborhoods, perpetuating historical redlining practices that block Black families from accessing home loans.
Worked Example
A city uses an algorithm to allocate public housing units to eligible applicants, trained on 10 years of historical public housing allocation data. The algorithm approves housing applications for white applicants at a 60% higher rate than Black applicants with identical income, family size, and housing need scores. Explain the root cause of this bias, and one practical mitigation step.
Solution
The bias originates from the historical training data, which reflects past systemic racism in the city's public housing allocation process, where Black applicants were systematically denied housing at higher rates than identical white applicants. The algorithm learns this historical pattern and reproduces it in its outputs, perpetuating discrimination. A valid mitigation step is to conduct regular third-party audits of the algorithm to identify disparate impact across racial groups, and remove any input variables that correlate with race and lead to unfair outcomes.
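The third-party audit described above can be approximated in a few lines of code. Below is a minimal Python sketch of a disparate-impact check; the function names, the hypothetical data, and the use of the "four-fifths" (80%) threshold from U.S. employment-discrimination guidance are our own illustrative assumptions, not a College Board requirement:

```python
def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs. Returns rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_flags(rates, threshold=0.8):
    """Flag groups approved at less than `threshold` times the best group's rate."""
    best = max(rates.values())
    return sorted(g for g, r in rates.items() if r < threshold * best)

# Hypothetical audit data mirroring the 60%-higher approval rate in the example:
# group A approved 48/100, group B approved 30/100 (0.48 = 1.6 * 0.30).
sample = [("A", True)] * 48 + [("A", False)] * 52 \
       + [("B", True)] * 30 + [("B", False)] * 70
rates = approval_rates(sample)
print(rates)                          # {'A': 0.48, 'B': 0.3}
print(disparate_impact_flags(rates))  # ['B'] -- fails the four-fifths check
```

A real audit would also control for legitimate input variables (income, family size, need score), but the core idea is the same: compare outcome rates across groups, not individual errors.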
6. Common Pitfalls (and how to avoid them)
- Pitfall 1: Only listing intended effects when asked to evaluate a computing innovation's impact. Why students do it: They only focus on the explicit stated purpose of the innovation and forget to address unintended outcomes. Correct move: Always list at least 1 intended beneficial, 1 unintended beneficial, and 1 unintended harmful effect for impact evaluation questions to earn full marks.
- Pitfall 2: Defining the digital divide as only "not having a phone". Why students do it: They oversimplify the multiple layers of the divide. Correct move: Mention that the digital divide includes lack of high-speed internet, digital literacy training, and accessible technology for disabled users, not just device ownership, to score all available points.
- Pitfall 3: Classifying all algorithm errors as algorithm bias. Why students do it: They mix up random individual errors and systematic group-level harm. Correct move: Only classify an error as bias if it is a consistent, systematic pattern of unfair outcomes for a specific group of users, not a one-off mistake for an individual user.
- Pitfall 4: Only discussing negative environmental impacts of computing when asked for an evaluation. Why students do it: They focus on well-publicized harms of crypto mining and data centers and forget positive environmental innovations. Correct move: Address both positive and negative environmental impacts for balanced evaluation questions to earn full marks.
- Pitfall 5: Suggesting "ban the algorithm" as the only mitigation for algorithm bias. Why students do it: They fail to think of practical, actionable interventions. Correct move: List concrete steps like using diverse representative training data, conducting regular third-party audits, testing with diverse user groups, and adding human oversight of high-stakes automated decisions for higher marks.
7. Practice Questions (AP Computer Science Principles Style)
Question 1
A public library system introduces a free e-book lending app that allows users to borrow digital books, audiobooks, and online courses from their phone or computer, no in-person visit required. (a) Identify one intended beneficial effect of the app. (b) Identify one unintended harmful effect related to the digital divide. (c) Identify one policy intervention the library could implement to reduce the harm identified in part (b).
Solution
(a) Intended beneficial effect: Reduces barriers to reading for users with mobility impairments who cannot travel to physical library locations, and expands access to educational resources for all library users. (b) Unintended harmful effect: Low-income users without a smartphone or home internet cannot access the e-book lending service, widening the gap in access to educational resources between low-income and high-income library users. (c) Intervention: The library could expand its free device lending program to lend tablets and mobile hotspots to users for home use, so all users can access the e-book app regardless of income.
Question 2
A social media platform trains an algorithm to recommend content to users, designed to maximize time spent on the platform. The platform finds that the algorithm disproportionately recommends false climate change misinformation to users under 18 living in rural areas. (a) Explain why this is an example of algorithm bias. (b) Describe one negative societal impact of this biased algorithm. (c) Describe one step the platform could take to reduce this bias.
Solution
(a) This is algorithm bias because it produces a systematic pattern of harmful outputs (misinformation recommendations) for a specific group (rural teen users), rather than random one-off errors. The bias likely originates from training data that shows rural teen users engage more with sensational climate content, so the algorithm learns to prioritize that content even if it is false. (b) Negative societal impact: Rural teen users are more likely to be exposed to false climate information, reducing their ability to make informed decisions about climate action and undermining public support for climate policy. (c) Mitigation step: The platform could add a fact-checking layer to the recommendation algorithm that reduces the priority of content verified as false by independent fact-checkers, and test the algorithm across diverse age and geographic user groups to identify disparate impact before deployment.
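The fact-checking layer suggested in part (c) can be sketched as a simple re-scoring step. Everything below is a hypothetical illustration: the function, the field names, and the 0.1 penalty factor are our assumptions, not any real platform's API:

```python
def rerank(posts, penalty=0.1):
    """Downweight engagement scores of posts flagged false by fact-checkers,
    then sort the feed by adjusted score (highest first)."""
    adjusted = [
        (post_id, round(score * (penalty if flagged_false else 1.0), 6))
        for post_id, score, flagged_false in posts
    ]
    return sorted(adjusted, key=lambda p: p[1], reverse=True)

# Hypothetical feed: (post_id, engagement_score, flagged_false_by_fact_checkers)
feed = [("climate-hoax", 9.0, True),
        ("homework-tips", 6.0, False),
        ("local-news", 4.0, False)]
print(rerank(feed))
# [('homework-tips', 6.0), ('local-news', 4.0), ('climate-hoax', 0.9)]
```

Note that the flagged post is demoted rather than deleted; on the exam, downranking, labeling, and diverse-group testing all count as concrete mitigation steps.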
Question 3
A university replaces its on-campus computer labs with remote cloud-based lab access for all students. The on-campus labs used 22,000 kWh of electricity per year from a grid that emits 0.45 kg of CO₂ per kWh. The cloud lab service uses 14,000 kWh of electricity per year from a data center powered by 100% hydroelectric energy (zero emissions). Calculate the annual CO₂ emission reduction from this change, showing your work.
Solution
First calculate annual emissions from the on-campus labs: 22,000 kWh × 0.45 kg CO₂/kWh = 9,900 kg of CO₂ per year. The cloud service emits 0 kg of CO₂ from electricity use. Total annual reduction: 9,900 kg − 0 kg = 9,900 kg of CO₂ per year.
8. Quick Reference Cheatsheet
| Subtopic | Key Definitions | Key Facts | Exam Tips |
|---|---|---|---|
| Beneficial vs Harmful Effects | Intended effect = designed purpose of an innovation; Unintended effect = unforeseen positive/negative outcome | All computing innovations have both intended and unintended effects | Always list 1 intended beneficial, 1 unintended beneficial, 1 unintended harmful effect for evaluation questions |
| Digital Divide | Systemic gap between groups with reliable access to internet, devices, digital literacy, and those without | Causes include income, geography, age, disability, gender; impacts include limited access to education, healthcare, jobs | Valid interventions: public library internet/device lending, rural broadband funding, free digital literacy programs |
| Computing and the Environment | E-waste = discarded electronics with toxic materials; Data center emissions = CO₂ from electricity used to run servers | 17% of global e-waste is recycled; smart grids reduce energy waste by 15% | Address both positive and negative environmental impacts for balanced evaluation questions |
| Bias in Algorithms | Systematic pattern of unfair outputs for specific groups from automated systems | Root causes: biased training data, lack of diverse testing, historical bias in input data | Only classify systematic group-level harm as bias (not one-off errors); valid mitigation steps: diverse training data, third-party audits, human oversight |
9. What's Next
Mastering the Impact of Computing unit directly supports your success in other high-weight components of the AP CSP syllabus. First, the same evaluation skills strengthen the Create Performance Task, worth 30% of your score: being able to weigh the potential positive and negative societal impacts of the computing innovation you design makes your written responses sharper. Second, it connects to the Data unit, where you will work with datasets and will need to identify potential bias in your data and outputs to avoid unfair outcomes. Understanding these concepts will also help you answer multiple-choice questions on the end-of-course exam, where 21-26% of questions are focused on this topic. If you have any questions about the impact of computing, digital divide interventions, algorithm bias mitigation, or any other AP CSP topic, you can ask Ollie, our AI tutor, at any time for personalized explanations, extra practice questions, or feedback on your Create Performance Task draft. Head to the homepage to get started with your AP CSP exam prep today.