CSE researchers report over $11M in research grants last quarter

The awards were distributed to 18 different principal investigators.

Researchers in CSE earned over $11M in research grants in the first quarter of the 2021 U-M fiscal year (July–October 2020). The awards were distributed to 18 different principal investigators. Learn more about the projects below, most of which began work between July and October 2020.


Automating the Verification of Distributed Systems

NSF Formal Methods in the Field – $749,943
PI: Manos Kapritsos
Co-PI: Baris Kasikci

Computer software is and has always been teeming with errors. When these errors manifest in a deployed system they can cause severe problems, including undesired behavior and unavailability of critical services. Formal verification is an approach that allows the writing of software that is provably free of such errors, but is notoriously difficult and time-consuming, which makes it harder to adopt in practice. The proposed research will investigate a new approach for automating the verification of complex software running on multiple machines, thus bringing formal verification closer to becoming a practical reality.
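
To make the idea concrete, here is a minimal sketch (not the project's technique, which targets proof automation for real systems code) of machine-checked verification by exhaustive state exploration: a toy lock protocol is checked against a mutual-exclusion invariant over its entire finite state space.

```python
# Minimal sketch: exhaustively check a safety invariant of a toy lock
# protocol. Real verification tools prove such properties for unbounded,
# realistic systems; this only enumerates a tiny finite state space.
def successors(state):
    """state[i] is True iff node i holds the lock."""
    for i, holds in enumerate(state):
        if holds:
            yield state[:i] + (False,) + state[i + 1:]   # node i releases
        elif not any(state):
            yield state[:i] + (True,) + state[i + 1:]    # node i acquires

def check(n=3):
    init = (False,) * n
    seen, stack = {init}, [init]
    while stack:
        s = stack.pop()
        assert sum(s) <= 1, f"mutual exclusion violated in {s}"
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    print(f"explored {len(seen)} states; mutual exclusion holds")

check()
```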


Complexity of Lattice Problems for Cryptography

NSF Algorithmic Foundations – $350,000
PI: Chris Peikert

Public-key cryptosystems are protocols that enable secure communication, even if the communicating parties have never met before. Such cryptosystems are essential to the modern Internet, where a majority of all communication is now encrypted. However, all public-key cryptosystems in widespread use today would be breakable by large-scale quantum computers. This project will inform and strengthen understanding of the security of lattice-based cryptosystems (systems built on complex, multi-dimensional grids of points), aiding ongoing standardization efforts for quantum-secure cryptography and the adoption of advanced, socially beneficial tools like encrypted computation.
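
For intuition, the sketch below illustrates the Learning With Errors (LWE) problem that underlies many lattice-based schemes: a message bit is hidden under a noisy inner product with a secret vector. This is a toy illustration at insecure parameters, not a scheme from this project.

```python
# Toy LWE encryption of a single bit -- illustration only, not secure.
import random

q, n = 3329, 16                               # small toy modulus and dimension
s = [random.randrange(q) for _ in range(n)]   # secret vector

def encrypt_bit(bit):
    a = [random.randrange(q) for _ in range(n)]   # public random vector
    e = random.randint(-2, 2)                     # small noise
    b = (sum(ai * si for ai, si in zip(a, s)) + e + bit * (q // 2)) % q
    return a, b

def decrypt_bit(a, b):
    # Remove <a, s>; what remains is bit*(q/2) plus small noise.
    m = (b - sum(ai * si for ai, si in zip(a, s))) % q
    return int(q // 4 < m < 3 * q // 4)           # closer to q/2 means bit 1

assert all(decrypt_bit(*encrypt_bit(bit)) == bit for bit in (0, 1, 1, 0))
```

Recovering the secret from many such noisy samples is believed to be hard even for quantum computers, which is what makes lattice problems attractive candidates for post-quantum standards.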


Creating Adoptable Computing Education Integrated into Social Studies Classes

NSF – $500,000
PI: Mark Guzdial
Co-PI: Tamara Shreiner (Grand Valley State University)

Computer science touches almost every aspect of our lives and offers great opportunity for impact. And yet, very few students take computer science courses while in high school. Even in states in which 40% or more of high schools offer a computer science class, less than 5% of students take advantage of that opportunity. This project proposes a new way to engage high school students with CS: integrate the use of purpose-built computer science tools that include programming into history courses. Teachers will be involved in the design process so that the curriculum and technology meet teachers’ perceptions of usefulness and usability. Students who take the courses will learn data manipulation skills as a part of completing their history assignments.
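
As a hypothetical flavor of such an assignment, a student might aggregate a small dataset inside a history lesson; the figures below are invented purely for illustration.

```python
# Invented mini-dataset: immigration arrivals by year (numbers are made up).
records = [
    {"year": 1902, "arrivals": 500_000},
    {"year": 1907, "arrivals": 1_200_000},
    {"year": 1913, "arrivals": 1_100_000},
]

# Group-and-sum by decade: a basic data-manipulation skill.
by_decade = {}
for r in records:
    decade = r["year"] // 10 * 10
    by_decade[decade] = by_decade.get(decade, 0) + r["arrivals"]

print(by_decade)   # {1900: 1700000, 1910: 1100000}
```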


Data Processing Against Synchronization Errors

NSF Communication and Information Foundations – $489,159
PI: Mahdi Cheraghchi

Robustness against errors is a fundamental aspect of the design of reliable communications systems, and one fundamental source of error in these systems is the lack of adequate synchronization between the source and the receiver. The advent of new DNA sequencing technologies, practical DNA storage systems, and DNA computing, which are expected to fundamentally impact the future of computing, brings renewed interest in robustness against synchronization errors. This research project addresses fundamental challenges caused by imperfect synchronization in communications and information systems. The scientific impacts of the project span technologies like DNA computing that are of key significance to the processing, transmission, and storage of massive data. The project puts forth the novel hypothesis that the deletion and Poisson capacity problems, two of the longest-standing open problems in classical information theory since the 1960s, have remained unresolved due to shared mathematical difficulties, and thus should be studied with a shared perspective. Finally, the project studies the complexity of a problem that provides a mathematical model to study the capacity of DNA storage systems.
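
The deletion channel mentioned above is simple to state and simulate, which makes its unresolved capacity all the more striking: each bit is independently dropped with probability d, and the receiver gets no indication of where deletions occurred. A sketch:

```python
# Binary deletion channel: each transmitted bit is independently deleted
# with probability d; the receiver sees only the surviving bits, with no
# markers for where deletions happened -- the synchronization loss.
import random

def deletion_channel(bits, d):
    return [b for b in bits if random.random() >= d]

sent = [1, 0, 1, 1, 0, 0, 1, 0]
received = deletion_channel(sent, d=0.25)
print(sent, "->", received)   # e.g. [1, 0, 1, 1, 0, 0, 1, 0] -> [1, 1, 0, 0, 1]
```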


Developing Methods for Automated Assessment

Department of Defense – $412,957
PI: H.V. Jagadish (with the University of Texas at Austin and Perigean Technologies)

The US Army has many training programs, each of which requires assessment. The goal of this project is to develop a system that reads training documentation, such as field manuals, and automatically generates assessment questions from it. To do so, the team will first identify concepts in a manual, determine their importance, and identify salient facts about each concept that can form the basis for a question.
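
A hypothetical sketch of that pipeline, with word frequency standing in for the much harder problem of judging concept importance, might look like the following (all names and text invented):

```python
# Hypothetical pipeline sketch: find candidate concepts in a manual, rank
# them by frequency as a crude importance score, and turn a sentence about
# the top concept into a fill-in-the-blank question.
import re
from collections import Counter

manual = (
    "The radio operator verifies the frequency before transmission. "
    "The operator logs each transmission. Antenna checks precede transmission."
)
sentences = re.split(r"(?<=\.)\s+", manual)
words = re.findall(r"[a-z]+", manual.lower())
concepts = Counter(w for w in words if len(w) > 6)   # crude concept filter
top, _ = concepts.most_common(1)[0]

for s in sentences:
    if top in s.lower():
        print(re.sub(top, "_____", s, flags=re.IGNORECASE))
        break
# -> "The radio operator verifies the frequency before _____."
```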


Foundations of Quantum Algorithms

Department of Defense – $149,177
PI: Yuri Gurevich

The study of quantum algorithms has concentrated mainly on particular types of computation, particular algorithms, and particular problems. Relatively little has been done in the direction of rigorous analysis based on first principles rather than on ad hoc decisions. There is not always a clear distinction between essential aspects of quantum computation and convenient “without loss of generality” conventions. The goal of this project is to develop a rigorous theory of quantum algorithms with a minimum of arbitrary presuppositions, relying as far as possible on first principles. This will include transforming various parts of quantum computation theory from being basically true, with various specified or unspecified caveats, to being literally true, with precise definitions, complete proofs, and all caveats spelled out.


Identifying Educational Conceptions and Challenges in Cybersecurity and Artificial Intelligence

NSF EAGER – $300,000
PI: Atul Prakash
Co-PIs: Mark Guzdial, Emily Mower Provost

Artificial intelligence (AI) has significant applications to many data-intensive emerging domains such as automated vehicles, computer-assisted medical imaging, behavior analysis, user authentication, cybersecurity, and embedded systems for smart infrastructures. However, there are unanswered questions relating to trust in AI systems. There is increasing evidence that machine learning algorithms can be maliciously manipulated to cause misclassification and false detection of objects and speech. This project brings together experts from the areas of education, AI, and cybersecurity to identify challenges and potential solutions to teaching topics in trustworthy AI with the goal of evolving coursework that will appeal to, and engage, a diverse student body.
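
A toy example of such manipulation, for a linear classifier rather than a deep network, shows the core idea: perturbing each feature slightly against the model's weights flips the prediction. The gradient-sign attacks used against neural networks generalize this trick.

```python
# Toy adversarial perturbation against a linear classifier (weights invented).
# Nudging each feature against the sign of its weight flips the predicted
# class while changing the input only slightly.
w, b = [1.5, -2.0, 0.5], 0.1           # toy trained weights and bias

def predict(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0

x = [1.0, 0.2, 0.6]                    # clean input, classified positive
eps = 0.5
x_adv = [xi - eps * (1 if wi > 0 else -1) for xi, wi in zip(x, w)]

print(predict(x), predict(x_adv))      # True False
```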


Ironpatch: Automatic Generation of Assured Micropatches

Department of Defense – $1,800,000
PI: Baris Kasikci
Co-PIs: Manos Kapritsos, Westley Weimer, Kevin Leach

Even the most advanced cars and other vehicles hide a rat’s nest of electronics—hundreds of processors and millions of lines of code that were designed separately but now must work together under the hood for years at a time. Keeping such a hodge-podge of systems updated and free of security vulnerabilities is exceedingly difficult. Ironpatch aims to develop a self-contained patching system to solve the growing problem of security vulnerabilities in cars and large vehicles like trucks and spacecraft. It does so by bypassing the source code and instead making tiny modifications called micropatches directly to the binary heart of the running software.
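
Schematically (this is not Ironpatch's actual mechanism, just the general shape of a binary micropatch), such a patch swaps a handful of bytes in place so the code's layout, and everything that depends on it, is undisturbed:

```python
# Schematic micropatch: replace a few bytes at a known offset in a binary
# image without recompiling any source. The hex values below are hypothetical.
def micropatch(image: bytes, offset: int, old: bytes, new: bytes) -> bytes:
    assert len(old) == len(new), "patch must preserve code layout"
    assert image[offset:offset + len(old)] == old, "unexpected bytes: wrong build?"
    return image[:offset] + new + image[offset + len(old):]

firmware = bytes.fromhex("909072059090")          # toy code region
patched = micropatch(firmware, offset=2,
                     old=bytes.fromhex("7205"),   # e.g. a flawed conditional jump
                     new=bytes.fromhex("7605"))   # its same-size corrected form
print(patched.hex())                              # 909076059090
```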


Knowledge Network Infrastructure with Application to COVID-19 Science and Economics

NSF Convergence Accelerator – $4,994,542
PI: Michael Cafarella
Co-PIs: Matthew Shapiro (U-M Dept. of Economics), Oren Etzioni (Allen Institute for AI)

The goal of this project is to build infrastructure for the efficient construction of knowledge networks and applications, and to demonstrate the system with concrete knowledge networks that describe COVID-19 science and economics. In the short term, this work will produce high-accuracy data resources useful to scientists and policymakers addressing the virus and its economic impact. The project will create programming tools that make knowledge networks and their applications far less expensive to build. Because knowledge networks combine unique data analysis qualities with the topical breadth of the entire World Wide Web, the potential of knowledge tools is very large and possibly transformative.
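
At its simplest, a knowledge network stores typed relationships between entities and answers pattern queries over them; the sketch below uses invented facts purely to show the shape of the data.

```python
# Invented example triples -- (subject, relation, object) edges of a
# knowledge network. Real networks hold millions of curated edges.
triples = {
    ("compound_A", "inhibits", "protein_B"),
    ("protein_B", "linked_to", "disease_C"),
    ("policy_D", "affects", "industry_E"),
}

def query(subj=None, rel=None, obj=None):
    """Return every triple matching the given pattern (None = wildcard)."""
    return [t for t in triples
            if subj in (None, t[0]) and rel in (None, t[1]) and obj in (None, t[2])]

print(query(rel="inhibits"))   # [('compound_A', 'inhibits', 'protein_B')]
```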


Leveraging Everyday Usage of Programs to Eliminate Bugs

NSF CAREER – $576,000
PI: Baris Kasikci

This project will devise a system that can automate some of the most difficult steps in tracking down bugs in software. By collecting and analyzing data already being produced by commonly used software, the system will learn key features of program bugs and errors that can automate their detection in the future. In the end, determining the most helpful kind of data from a program’s execution could lead to an automated tool to determine which of a program’s states are safe and which are likely to produce bugs, vastly cutting down the manual work for developers.
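
One way to picture the idea (a hypothetical sketch, not the project's actual technique): program states seen often across many healthy executions are presumed safe, while rare or never-before-seen states get flagged for inspection.

```python
# Hypothetical sketch: flag program states that rarely appear in data
# gathered from everyday, healthy executions.
from collections import Counter

# Traces: sequences of abstracted program states from everyday usage (invented).
healthy_traces = [
    ["open", "read", "close"],
    ["open", "read", "read", "close"],
    ["open", "close"],
]
observed = Counter(s for t in healthy_traces for s in t)

def suspicious(state, threshold=2):
    # States rarely (or never) seen in healthy runs are flagged.
    return observed[state] < threshold

print(suspicious("read"), suspicious("write_after_close"))   # False True
```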


Speech-Centered Robust and Generalizable Measurements of “In the Wild” Behavior for Mental Health Symptom Severity Tracking

NSF Robust Intelligence – $450,000
PI: Emily Mower Provost

Bipolar disorder is a common and chronic illness characterized by pathological swings from euthymia (healthy mood) to mania (heightened energy) and depression (lowered energy). Mood transitions have profound consequences for one’s personal, social, vocational, and financial well-being. Current management is clinic-based and dependent on provider-patient interactions, yet increased demand for services has outstripped capacity, calling for radical changes in the delivery of care. This project will create new algorithms that can process speech data collected naturally during smartphone use to measure behavior and changes in behavior, and to associate these measurements with the severity of bipolar disorder symptoms. This will lead to new early warning signs: indications that clinical intervention is needed.
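
For a sense of the raw material, the sketch below computes two simple per-utterance statistics of the kind such pipelines might start from; the project's actual features and models are far richer.

```python
# Illustrative only: two crude per-utterance speech statistics (RMS energy
# and fraction of near-silent samples) computed from raw audio samples.
import math

def features(samples, silence_thresh=0.02):
    energy = math.sqrt(sum(s * s for s in samples) / len(samples))  # RMS energy
    pauses = sum(1 for s in samples if abs(s) < silence_thresh)
    return {"rms_energy": energy, "pause_fraction": pauses / len(samples)}

print(features([0.0, 0.3, -0.4, 0.01, 0.0, 0.25]))
```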


Taming the Instruction Bottleneck in Modern Datacenter Applications

NSF Foundational Microarchitecture Research – $360,000
PI: Baris Kasikci

Data centers are the power plants that drive the digital economy. Yet, as vast as these computing resources are, the amount of code they are tasked with running is larger still and growing at around 20% each year. This project will explore new hardware and software mechanisms that allow large data center programs to better fit within their limited computing resources, improving data center program performance and energy efficiency and reducing the toll that power-hungry data centers take on our planet. The project will leverage new hardware mechanisms for profiling program execution and use that profiling information to adjust the code of the program at runtime to make better use of on-chip caches.
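
In miniature, the core idea of profile-guided layout is just ordering code by measured hotness so the instruction cache holds what actually runs. A deliberately simplified sketch (the project works on live binaries, not labeled call counts):

```python
# Deliberately simplified: order functions by profiled call counts so hot
# code packs together in the instruction cache and cold code is demoted.
profile = {"parse": 9_100, "dispatch": 8_700, "log_debug": 3, "init": 1}

layout = sorted(profile, key=profile.get, reverse=True)
print(layout)   # hot first: ['parse', 'dispatch', 'log_debug', 'init']
```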


Understanding Hand Interaction In The Jumble of Internet Videos

NSF Robust Intelligence – $436,971
PI: David Fouhey

Hands are the primary way that humans interact with and manipulate the world. Intelligent machines will need to understand how humans use their hands if they are to understand human actions and to work in the world humans have built with their hands. Unfortunately, videos that show people using their hands are surprisingly difficult for current artificial intelligence (AI) systems to understand. This project develops systems that can learn how humans use their hands from large-scale Internet video data. As hands are central to many other areas of study, this project has the potential to empower research in many other disciplines. For instance, robotics researchers may use the systems to teach robots how to interact with objects by observation. Similarly, kinesiologists and mechanical engineers who study how the human hand is used could use the systems to better quantify hand motions and thus improve people’s lives.


Wizkit: Wide-scale Zero-Knowledge Interpreter Toolkit

DARPA Securing Information for Encrypted Verification and Evaluation – $184,521
PI: Daniel Genkin (sub-contractor with Northwestern University, Texas A&M, University of Vermont, and Ligero under Stealth Software Technologies, Inc.)

Data-sensitive applications have an immense need for assurances of the veracity of sensitive statements, or of programs that use private data. Zero-knowledge computations, used in systems such as cryptocurrencies, make it possible to prove that a statement is trustworthy without revealing the underlying data. Data scientists constantly face the problem of reproducibility when published algorithms and methods are run on private data; machine learning and private data-mining algorithms, compliance with legal frameworks, proof of non-disclosure of personal data, and proofs of differential privacy are other important applications. This work aims to allow the compilation of interpreters that support zero-knowledge statements, enabling zero-knowledge cryptographic proofs for 12 important classes of statements and beyond. If this work is successful, it will usher in a wide range of capabilities for providing assurance of the veracity of sensitive statements.
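
The classic Schnorr protocol gives a feel for zero knowledge at toy scale: the prover demonstrates knowledge of a secret exponent without revealing it. This is an illustrative sketch only; the program targets far richer statements than discrete logarithms.

```python
# Toy Schnorr proof of knowledge (insecure toy parameters, illustration only):
# the prover knows x with y = g^x mod p and convinces the verifier of that
# fact without revealing x.
import random

p = 2**127 - 1                      # a Mersenne prime (fine for a toy)
g = 2
x = random.randrange(1, p - 1)      # prover's secret
y = pow(g, x, p)                    # public value

r = random.randrange(1, p - 1)      # prover commits to fresh randomness
t = pow(g, r, p)
c = random.randrange(1, p - 1)      # verifier's random challenge
s = (r + c * x) % (p - 1)           # prover's response; reveals nothing about x

assert pow(g, s, p) == (t * pow(y, c, p)) % p   # verifier's check passes
print("proof verified")
```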
