
NSF Org: CCF Division of Computing and Communication Foundations
Recipient:
Initial Amendment Date: July 27, 2020
Latest Amendment Date: May 25, 2021
Award Number: 2007718
Award Instrument: Standard Grant
Program Manager: Sol Greenspan, sgreensp@nsf.gov, (703) 292-7841, CCF Division of Computing and Communication Foundations, CSE Directorate for Computer and Information Science and Engineering
Start Date: August 1, 2020
End Date: July 31, 2025 (Estimated)
Total Intended Award Amount: $250,000.00
Total Awarded Amount to Date: $266,000.00
Funds Obligated to Date: FY 2021 = $16,000.00
History of Investigator:
Recipient Sponsored Research Office: 1 UTSA CIR, SAN ANTONIO, TX 78249-1644, US, (210) 458-4340
Sponsor Congressional District:
Primary Place of Performance: One UTSA Circle, San Antonio, TX 78249-1644, US
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): Software & Hardware Foundation
Primary Program Source: 01002122DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
Software applications with Graphical User Interfaces (GUIs) have become essential in people's daily lives, and sufficient testing is a necessity to ensure their quality. When performed manually, GUI testing is a costly and tedious process requiring many human testers to explore the user interface and check whether the output is as expected. In contrast, existing automated testing techniques are less effective because they lack the domain knowledge that human testers typically possess. In this project, the investigators will explore the reuse and migration of manual GUI tests, an alternative route to complement existing automatic GUI-testing research. The intuitive observation behind the project is that developers tend to use similar GUI designs in different platform versions of the same application, or in different applications within the same domain. Therefore, it is possible to reuse the exploration sequences, input values, and expected output, with proper adaptations that take into account the subtle implementation differences between applications. The project is expected to enhance the coverage and productivity of GUI-testing processes, leading to GUI applications with higher quality and fewer defects. Additionally, the incorporated training and education activities will provide opportunities for participants to acquire research experience and become highly qualified researchers and practitioners.
In this project, the PIs will answer the research question of whether and how existing GUI tests can be reused in automatic GUI testing with the necessary adaptation. In particular, the investigators will work on the generation of GUI-code embeddings to represent the semantics of GUI views, and will develop novel GUI-view mapping techniques to map GUI views among different applications. The investigators will also study how input-value constraints and event-sequence constraints in existing GUI tests can be extracted as domain knowledge, how such knowledge can be translated across platform and application boundaries, and how the translated knowledge can be incorporated into the automatic GUI-test generation process of the target application. Moreover, the investigators will develop techniques to identify the potential reusability of existing test oracles by measuring their fitness with the new context, and techniques to create new test oracles by summarizing common behaviors of software applications in the same domain. The findings of this project are intended to shed light on the more general problem of reusing and migrating other kinds of test cases, such as unit tests and integration tests, as well as on the open problem of creating meaningful test oracles.
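To make the GUI-view mapping idea concrete, here is a minimal, hypothetical sketch (not the project's actual algorithm): each GUI view is reduced to a bag of widget and label tokens, embedded as a token-count vector, and views from a source and a target application are paired by cosine similarity. The view names and token sets below are invented for illustration.

```python
# Hypothetical embedding-based GUI-view mapping sketch.
# Assumption: a "view" is represented by the tokens of its widgets/labels.
from collections import Counter
from math import sqrt

def embed(view_tokens):
    """Toy embedding: a sparse token-count vector for a GUI view."""
    return Counter(view_tokens)

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def map_views(source_views, target_views):
    """Map each source view to its most similar target view."""
    mapping = {}
    for s_name, s_tokens in source_views.items():
        s_vec = embed(s_tokens)
        best = max(target_views,
                   key=lambda t: cosine(s_vec, embed(target_views[t])))
        mapping[s_name] = best
    return mapping

# Invented example: a login screen in app A pairs with the sign-in
# screen in app B, so exploration sequences recorded on one could be
# replayed, with adaptation, on the other.
app_a = {"login": ["username", "password", "submit"],
         "search": ["query", "search", "results"]}
app_b = {"sign_in": ["username", "password", "login", "button"],
         "find": ["query", "search", "filter"]}
print(map_views(app_a, app_b))  # {'login': 'sign_in', 'search': 'find'}
```

A real system would replace the count vectors with learned GUI-code embeddings, but the matching step (nearest neighbor under a similarity measure) has the same shape.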
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH