PDC Institute Planning Grant - Final Report

Final Report - Collaborative Research: CyberTraining: Conceptualization: Planning a Sustainable Ecosystem for Incorporating Parallel and Distributed Computing Into Undergraduate Education*

(NSF AWARD # 2002649, PROJECT PERIOD: 09/20/2019 - 12/31/2022)

Sushil K Prasad, University of Texas at San Antonio

Anshul Gupta, IBM Research

Alan Sussman, University of Maryland

Charles Weems and Neena Thota, UMass Amherst

Ramachandran Vaidyanathan, LSU

Center for Parallel and Distributed Computing Curriculum Development and Educational Resources (CDER)

June 6, 2023

Link to the Final Project Report to NSF


In this era of pervasive multicore machines, GPUs, cloud services, big data, machine learning, and the Internet of Things, CDER has been working for over a decade to infuse parallel and distributed computing (PDC) concepts into undergraduate computing curricula, with clear implications for national security, research and development, and workforce and economic development. Sponsored by an NSF institute conceptualization planning grant (Award #2002649), CDER has conducted an extensive stakeholder community input study via a series of four workshops to identify the underlying impediments to more effective adoption and to formulate the key attributes of an institute that would help eliminate the longstanding barrier of the sequential computing paradigm. The breadth and complexity of the impediments and the scale of impact together justify an institute-level investment to bring computing curricula into the 21st century.

The Problem: Modern computing systems pervasively employ PDC, which new computer science and computer engineering graduates (and graduates of computational and data-related disciplines) are not prepared to work with, according to multiple stakeholder groups (national labs, research institutions, government agencies including DoD and DOE, and industry). Recent graduates do not understand the modern software development process and ecosystem, which relies on parallelism, distribution, asynchrony, scaling, integration across disparate libraries and data sources, test-driven design, and pervasive concerns for security.

Due to this lack of preparation, the nation is suffering a loss of competitiveness with respect to other countries that are addressing the problem (including China, India, and Russia). Affected areas include national security (cybersecurity, intelligence analysis, cyber warfare, defense systems), research and development (high performance computing, data science, AI, and modeling for critical applications such as pharmaceuticals, advanced materials, climate forecasting, chip design and manufacturing), and the economy (high onboarding cost of new graduates, cloud computing infrastructure, renewable energy, smart electrical grid, secure e-commerce, and healthcare systems).

The lack of preparation originates in the obsolete model of computing taught at the introductory level. This model, based on systems of the late 1970s, presents computing as inherently sequential and synchronous, in a uniprocessing context, with a local file system and a command-line user interface. The model continues through the coverage of data structures, algorithms, complexity theory, and software engineering, leading to a computational problem-solving mindset that is inherently sequential, and thus unable to engineer software for modern systems. Once this model is established, some students may encounter PDC in an ad hoc manner via electives on web programming, operating systems, parallel computing, etc., but there it is treated as a specialized concept that is only relevant within a limited context.

To illustrate the scale of the economic impact, employer stakeholders told us that it takes up to six months of training to onboard new computer science graduates before they can be productive. Based on data from the Bureau of Labor Statistics and the National Center for Education Statistics, even if only 20% of new graduates require this amount of training, modernizing the computing curricula would save the US economy an estimated $880B per year. Education stakeholders said that changing the model is nearly impossible because of a lack of incentives and systemic impediments: industry still hires graduates, there is no funding to support major curriculum change, and accreditation standards do not require it, so there is little pressure to change. Specific impediments include a lack of exemplars of modern curricula, few teaching materials, a lack of instructor training in modern computing, a lack of release time for updating courses, articulation agreements with community colleges, advanced placement exam standards, development environments and PDC language features that are not suitable for beginners, and a lack of computing resources and support for students to experience scaling of parallelism and distribution.
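The structure of such an onboarding-cost estimate can be written out as a simple product (symbolic only; the specific parameter values behind the $880B figure are not given here and are not assumed):

    S = G × p × (t / 12) × C

where S is the annual savings, G is the number of new computing-related graduates entering the workforce per year, p is the fraction requiring extended onboarding (20% in the scenario above), t is the training duration in months (up to six), and C is the annual fully loaded cost of an employee during training.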

The Solution: The model of computation that is used to introduce computing students to the concepts and approaches of computational problem solving needs to be updated to reflect modern PDC systems, tools, methods, algorithms, use cases, challenges, datasets, and threats. It will take a significant investment to overcome the inertia in the education system and to change the expectations of employers and accreditation boards. Funding is needed not just to enable but to incentivize pioneering efforts to develop exemplar courses and curricula in a diverse set of institutions. This set must be large enough to create a critical mass of change and a bandwagon effect across the computing disciplines. This also opens up an opportunity for curriculum-based rethinking around broadening participation in computing goals. The pioneering efforts need to be coordinated so that their work can be collaborative, both to leverage each other's efforts, and to produce approaches that are sufficiently coherent to create a new standard for accreditation and advanced placement, as well as articulation across multi-institute systems.

In addition to coordinated curriculum development teams, there is a need for a centralized repository of developed materials that is curated to ensure quality and portability. As the exemplars are developed, there must be an effort to transfer them to new institutions, so as to evaluate their effectiveness in different contexts, which also necessitates investment in training and support for adoption. The need for student-friendly software, turnkey environments, and broad dissemination implies centralized technical support during the pioneering and transfer phases. An outreach team is needed to shift employer expectations, foster collaborative partnerships, and work with standards organizations. Perhaps most importantly, there is a need for a marketing team to continually assess stakeholder needs, provide input to the other efforts, and promote the benefits of the changes to each stakeholder community.

Conclusion: It is clear that incremental or piecemeal approaches have not worked. A transformative effort of this scale requires a major institute to provide the necessary coordination, well-structured services, and a focal point to ensure success. The expected benefit of this investment is a modernized computing workforce that restores and maintains US competitiveness with respect to national defense, research and development, and cyberinfrastructure in support of scientific, social, and commercial goals. The savings to the US economy would provide an annual return on investment four orders of magnitude greater than the total cost of the program.

                                                Table of Contents

2.1 Stakeholder Input Workshops
2.1.1. First NSF Institute Planning Workshop, Nov 18, 2019, Denver, alongside SC'19 (in person)
2.1.2 2nd NSF Planning Workshop, March 11, 2020, alongside SIGCSE'20 (online)
2.1.3 3rd NSF Institute Planning Workshop, July 17, 2020 (online)
2.2 Institute Design Input Workshop, March 2021 (online)
2.3 Final Reporting-out Workshop to NSF, Alexandria, VA, Oct 22, 2021 (hybrid)
3.1 Stakeholder Input Workshops - Summary Results of Three Workshops
3.2 Institute Design Input Workshop
3.3 Final Reporting Workshop

3.3.1 Community Input - The Problem - What was learned from the past four community workshops?

Summary Findings



3.3.2 Community Input - Solutions - What could an institute provide?

Summary Findings



3.3.3 Design and roadmap of an Institute on Parallel and Distributed Computing (PDC) Education

Summary - What Must the USA Do to Modernize Computing Education?

PDC Institute's Design and Priorities

Exemplar Development

Staffing, Budget estimates






A. Project Collaborators
B. Workshop Attendees

B1. Participants of the First NSF Institute Planning Workshop, Nov 18, 2019, Denver

B2. Participants of the 2nd NSF Planning Workshop, March 11, 2020 (online)

B3. Participants of the 3rd NSF Institute Planning Workshop, July 17, 2020 (online)

B4. Participants of the Institute Design Input Workshop, March 2021 (online)

B5. Participants of the Final Reporting-out Workshop to NSF, Alexandria, VA, Oct 22, 2021 (hybrid)






C. Discussion Data Summary from Stakeholder Input Workshops

C1. 1st NSF Institute Planning Workshop, Nov 18, 2019

C2. 2nd NSF Planning Workshop, March 11, 2020

C3. 3rd NSF Institute Planning Workshop, July 17, 2020




D. Discussion Data from Institute Design Input Workshop, March 2021 (online)



* How to cite this report: Prasad, S. K., Gupta, A., Sussman, A., Vaidyanathan, R., and Weems, C. 2023. Final Report - Collaborative Research: CyberTraining: Conceptualization: Planning a Sustainable Ecosystem for Incorporating Parallel and Distributed Computing into Undergraduate Education, NSF Award # 2002649, Online:, 65 pages.

Contact: Sushil K. Prasad,